---
title: Create an Azure data factory using REST API | Microsoft Docs
description: Create an Azure data factory to copy data from one location in Azure Blob storage to another location.
services: data-factory
documentationcenter:
author: linda33wj
manager: craigg
ms.reviewer: douglasl
ms.service: data-factory
ms.workload: data-services
ms.tgt_pltfrm:
ms.devlang: rest-api
ms.topic: quickstart
ms.date: 01/22/2018
ms.author: jingwang
---
Azure Data Factory is a cloud-based data integration service that allows you to create data-driven workflows in the cloud for orchestrating and automating data movement and data transformation. Using Azure Data Factory, you can create and schedule data-driven workflows (called pipelines) that ingest data from disparate data stores, process and transform the data by using compute services such as Azure HDInsight Hadoop, Spark, Azure Data Lake Analytics, and Azure Machine Learning, and publish output data to data stores such as Azure SQL Data Warehouse for business intelligence (BI) applications to consume.
This quickstart describes how to use the REST API to create an Azure data factory. The pipeline in this data factory copies data from one location to another location in Azure Blob storage.
If you don't have an Azure subscription, create a free account before you begin.
- Azure subscription. If you don't have a subscription, you can create a free trial account.
- Azure Storage account. You use Blob storage as both the source and sink data store. If you don't have an Azure storage account, see the Create a storage account article for steps to create one.
- Create a blob container in Blob storage, create an input folder in the container, and upload some files to the folder. You can use tools such as Azure Storage Explorer to connect to Azure Blob storage, create a blob container, upload an input file, and verify the output file.
- Install Azure PowerShell. Follow the instructions in How to install and configure Azure PowerShell. This quickstart uses PowerShell to invoke REST API calls.
- Create an application in Azure Active Directory by following these instructions. Make note of the following values that you use in later steps: application ID, authentication key, and tenant ID. Assign the application to the Contributor role.
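If you prefer the command line, the Azure AD application and role assignment can also be created with the Azure CLI. The sketch below is one possible approach, not part of this quickstart's required steps; the display name ADFQuickstartApp is a placeholder, and `<SubscriptionId>` must be replaced with your own subscription ID.

```azurecli
# Sign in and select the subscription that will contain the data factory.
az login
az account set --subscription "<SubscriptionId>"

# Create a service principal and grant it the Contributor role on the
# subscription. "ADFQuickstartApp" is a placeholder display name.
az ad sp create-for-rbac --name "ADFQuickstartApp" --role Contributor --scopes "/subscriptions/<SubscriptionId>"

# The appId, password, and tenant fields in the output correspond to the
# application ID, authentication key, and tenant ID used in later steps.
```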
Launch PowerShell. Keep Azure PowerShell open until the end of this quickstart. If you close and reopen, you need to run the commands again.
Run the following command, and enter the user name and password that you use to sign in to the Azure portal:
Connect-AzureRmAccount
Run the following command to view all the subscriptions for this account:
Get-AzureRmSubscription
Run the following command to select the subscription that you want to work with. Replace SubscriptionId with the ID of your Azure subscription:
Select-AzureRmSubscription -SubscriptionId "<SubscriptionId>"
Run the following commands, after replacing the placeholders with your own values, to set global variables to be used in later steps.
$tenantID = "<your tenant ID>"
$appId = "<your application ID>"
$authKey = "<your authentication key for the application>"
$subsId = "<your subscription ID to create the factory>"
$resourceGroup = "<your resource group to create the factory>"
$dataFactoryName = "<specify the name of data factory to create. It must be globally unique.>"
$apiVersion = "2017-09-01-preview"
Run the following commands to authenticate with Azure Active Directory (AAD):
$AuthContext = [Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext]"https://login.microsoftonline.com/${tenantId}"
$cred = New-Object -TypeName Microsoft.IdentityModel.Clients.ActiveDirectory.ClientCredential -ArgumentList ($appId, $authKey)
$result = $AuthContext.AcquireToken("https://management.core.windows.net/", $cred)
$authHeader = @{
    'Content-Type'='application/json'
    'Accept'='application/json'
    'Authorization'=$result.CreateAuthorizationHeader()
}
Run the following commands to create a data factory:
$request = "https://management.azure.com/subscriptions/${subsId}/resourceGroups/${resourceGroup}/providers/Microsoft.DataFactory/factories/${dataFactoryName}?api-version=${apiVersion}"
$body = @"
{
    "name": "$dataFactoryName",
    "location": "East US",
    "properties": {},
    "identity": {
        "type": "SystemAssigned"
    }
}
"@
$response = Invoke-RestMethod -Method PUT -Uri $request -Header $authHeader -Body $body
$response | ConvertTo-Json
Note the following points:
- The name of the Azure data factory must be globally unique. If you receive the following error, change the name and try again.

  Data factory name "ADFv2QuickStartDataFactory" is not available.
- For a list of Azure regions in which Data Factory is currently available, select the regions that interest you on the following page, and then expand Analytics to locate Data Factory: Products available by region. The data stores (Azure Storage, Azure SQL Database, and so on) and computes (HDInsight and so on) used by the data factory can be in other regions.
Here is the sample response:
{
    "name": "<dataFactoryName>",
    "tags": {},
    "properties": {
        "provisioningState": "Succeeded",
        "loggingStorageAccountKey": "**********",
        "createTime": "2017-09-14T06:22:59.9106216Z",
        "version": "2017-09-01-preview"
    },
    "identity": {
        "type": "SystemAssigned",
        "principalId": "<service principal ID>",
        "tenantId": "<tenant ID>"
    },
    "id": "dataFactoryName",
    "type": "Microsoft.DataFactory/factories",
    "location": "East US"
}
You create linked services in a data factory to link your data stores and compute services to the data factory. In this quickstart, you only need to create one Azure Storage linked service to use as both the copy source and the sink store, named "AzureStorageLinkedService" in the sample.
Run the following commands to create a linked service named AzureStorageLinkedService:
Replace <accountName> and <accountKey> with the name and key of your Azure storage account before executing the commands.
$request = "https://management.azure.com/subscriptions/${subsId}/resourceGroups/${resourceGroup}/providers/Microsoft.DataFactory/factories/${dataFactoryName}/linkedservices/AzureStorageLinkedService?api-version=${apiVersion}"
$body = @"
{
    "name": "AzureStorageLinkedService",
    "properties": {
        "type": "AzureStorage",
        "typeProperties": {
            "connectionString": {
                "value": "DefaultEndpointsProtocol=https;AccountName=<accountName>;AccountKey=<accountKey>",
                "type": "SecureString"
            }
        }
    }
}
"@
$response = Invoke-RestMethod -Method PUT -Uri $request -Header $authHeader -Body $body
$response | ConvertTo-Json
Here is the sample output:
{
    "id": "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroupName>/providers/Microsoft.DataFactory/factories/<dataFactoryName>/linkedservices/AzureStorageLinkedService",
    "name": "AzureStorageLinkedService",
    "properties": {
        "type": "AzureStorage",
        "typeProperties": {
            "connectionString": "@{value=**********; type=SecureString}"
        }
    },
    "etag": "0000c552-0000-0000-0000-59b1459c0000"
}
You define a dataset that represents the data to copy from a source to a sink. In this example, the Blob dataset refers to the Azure Storage linked service you created in the previous step. The dataset takes a parameter whose value is set in an activity that consumes the dataset. The parameter is used to construct the "folderPath" that points to where the data resides.
$request = "https://management.azure.com/subscriptions/${subsId}/resourceGroups/${resourceGroup}/providers/Microsoft.DataFactory/factories/${dataFactoryName}/datasets/BlobDataset?api-version=${apiVersion}"
$body = @"
{
    "name": "BlobDataset",
    "properties": {
        "type": "AzureBlob",
        "typeProperties": {
            "folderPath": {
                "value": "@{dataset().path}",
                "type": "Expression"
            }
        },
        "linkedServiceName": {
            "referenceName": "AzureStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "path": {
                "type": "String"
            }
        }
    }
}
"@
$response = Invoke-RestMethod -Method PUT -Uri $request -Header $authHeader -Body $body
$response | ConvertTo-Json
Here is the sample output:
{
    "id": "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroupName>/providers/Microsoft.DataFactory/factories/<dataFactoryName>/datasets/BlobDataset",
    "name": "BlobDataset",
    "properties": {
        "type": "AzureBlob",
        "typeProperties": {
            "folderPath": "@{value=@{dataset().path}; type=Expression}"
        },
        "linkedServiceName": {
            "referenceName": "AzureStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "path": "@{type=String}"
        }
    },
    "etag": "0000c752-0000-0000-0000-59b1459d0000"
}
In this example, the pipeline contains one activity and takes two parameters: the input blob path and the output blob path. The values for these parameters are set when the pipeline is run. The copy activity refers to the same Blob dataset created in the previous step as both input and output. When the dataset is used as an input dataset, the input path is specified; when the dataset is used as an output dataset, the output path is specified.
$request = "https://management.azure.com/subscriptions/${subsId}/resourceGroups/${resourceGroup}/providers/Microsoft.DataFactory/factories/${dataFactoryName}/pipelines/Adfv2QuickStartPipeline?api-version=${apiVersion}"
$body = @"
{
    "name": "Adfv2QuickStartPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlobToBlob",
                "type": "Copy",
                "inputs": [
                    {
                        "referenceName": "BlobDataset",
                        "parameters": {
                            "path": "@pipeline().parameters.inputPath"
                        },
                        "type": "DatasetReference"
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "BlobDataset",
                        "parameters": {
                            "path": "@pipeline().parameters.outputPath"
                        },
                        "type": "DatasetReference"
                    }
                ],
                "typeProperties": {
                    "source": {
                        "type": "BlobSource"
                    },
                    "sink": {
                        "type": "BlobSink"
                    }
                }
            }
        ],
        "parameters": {
            "inputPath": {
                "type": "String"
            },
            "outputPath": {
                "type": "String"
            }
        }
    }
}
"@
$response = Invoke-RestMethod -Method PUT -Uri $request -Header $authHeader -Body $body
$response | ConvertTo-Json
Here is the sample output:
{
    "id": "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroupName>/providers/Microsoft.DataFactory/factories/<dataFactoryName>/pipelines/Adfv2QuickStartPipeline",
    "name": "Adfv2QuickStartPipeline",
    "properties": {
        "activities": [
            "@{name=CopyFromBlobToBlob; type=Copy; inputs=System.Object[]; outputs=System.Object[]; typeProperties=}"
        ],
        "parameters": {
            "inputPath": "@{type=String}",
            "outputPath": "@{type=String}"
        }
    },
    "etag": "0000c852-0000-0000-0000-59b1459e0000"
}
In this step, you set the values of the inputPath and outputPath parameters specified in the pipeline to the actual source and sink blob paths, and trigger a pipeline run. The pipeline run ID returned in the response body is used in the later monitoring API calls.
Replace the values of inputPath and outputPath with your source and sink blob paths to copy data from and to before running the commands.
$request = "https://management.azure.com/subscriptions/${subsId}/resourceGroups/${resourceGroup}/providers/Microsoft.DataFactory/factories/${dataFactoryName}/pipelines/Adfv2QuickStartPipeline/createRun?api-version=${apiVersion}"
$body = @"
{
    "inputPath": "<the path to existing blob(s) to copy data from, e.g. containername/path>",
    "outputPath": "<the blob path to copy data to, e.g. containername/path>"
}
"@
$response = Invoke-RestMethod -Method POST -Uri $request -Header $authHeader -Body $body
$response | ConvertTo-Json
$runId = $response.runId
Here is the sample output:
{
"runId": "2f26be35-c112-43fa-9eaa-8ba93ea57881"
}
Run the following script to continuously check the pipeline run status until it finishes copying the data.
$request = "https://management.azure.com/subscriptions/${subsId}/resourceGroups/${resourceGroup}/providers/Microsoft.DataFactory/factories/${dataFactoryName}/pipelineruns/${runId}?api-version=${apiVersion}"
while ($True) {
    $response = Invoke-RestMethod -Method GET -Uri $request -Header $authHeader
    Write-Host "Pipeline run status: " $response.Status -foregroundcolor "Yellow"

    if ($response.Status -eq "InProgress") {
        Start-Sleep -Seconds 15
    }
    else {
        $response | ConvertTo-Json
        break
    }
}
Here is the sample output:
{
    "key": "000000000-0000-0000-0000-00000000000",
    "timestamp": "2017-09-07T13:12:39.5561795Z",
    "runId": "000000000-0000-0000-0000-000000000000",
    "dataFactoryName": "<dataFactoryName>",
    "pipelineName": "Adfv2QuickStartPipeline",
    "parameters": [
        "inputPath: <inputBlobPath>",
        "outputPath: <outputBlobPath>"
    ],
    "parametersCount": 2,
    "parameterNames": [
        "inputPath",
        "outputPath"
    ],
    "parameterNamesCount": 2,
    "parameterValues": [
        "<inputBlobPath>",
        "<outputBlobPath>"
    ],
    "parameterValuesCount": 2,
    "runStart": "2017-09-07T13:12:00.3710792Z",
    "runEnd": "2017-09-07T13:12:39.5561795Z",
    "durationInMs": 39185,
    "status": "Succeeded",
    "message": ""
}
Run the following script to retrieve copy activity run details, for example, size of the data read/written.
$request = "https://management.azure.com/subscriptions/${subsId}/resourceGroups/${resourceGroup}/providers/Microsoft.DataFactory/factories/${dataFactoryName}/pipelineruns/${runId}/activityruns?api-version=${apiVersion}&startTime="+(Get-Date).ToString('yyyy-MM-dd')+"&endTime="+(Get-Date).AddDays(1).ToString('yyyy-MM-dd')+"&pipelineName=Adfv2QuickStartPipeline"
$response = Invoke-RestMethod -Method GET -Uri $request -Header $authHeader
$response | ConvertTo-Json
Here is the sample output:
{
    "value": [
        {
            "id": "000000000-0000-0000-0000-00000000000",
            "timestamp": "2017-09-07T13:12:38.4780542Z",
            "pipelineRunId": "000000000-0000-00000-0000-0000000000000",
            "pipelineName": "Adfv2QuickStartPipeline",
            "status": "Succeeded",
            "failureType": "",
            "linkedServiceName": "",
            "activityName": "CopyFromBlobToBlob",
            "activityType": "Copy",
            "activityStart": "2017-09-07T13:12:02.3299261Z",
            "activityEnd": "2017-09-07T13:12:38.4780542Z",
            "duration": 36148,
            "input": "@{source=; sink=}",
            "output": "@{dataRead=331452208; dataWritten=331452208; copyDuration=22; throughput=14712.9; errors=System.Object[]; effectiveIntegrationRuntime=DefaultIntegrationRuntime (West US); usedDataIntegrationUnits=2; billedDuration=22}",
            "error": "@{errorCode=; message=; failureType=; target=CopyFromBlobToBlob}"
        }
    ]
}
Use Azure Storage Explorer to verify that the blob(s) were copied from "inputBlobPath" to "outputBlobPath", as you specified when creating the pipeline run.
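If you prefer to verify from the same PowerShell session instead of Azure Storage Explorer, a minimal sketch using the Azure.Storage cmdlets follows; the account name and key, container name, and output folder prefix are placeholders that you replace with the values you used for outputPath.

```powershell
# Placeholders: replace with your storage account name/key and the
# container and folder you used for outputPath.
$storageContext = New-AzureStorageContext -StorageAccountName "<accountName>" -StorageAccountKey "<accountKey>"

# List the blobs under the output folder to confirm the copy succeeded.
Get-AzureStorageBlob -Context $storageContext -Container "<containerName>" -Prefix "<outputFolder>" |
    Select-Object Name, Length, LastModified
```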
You can clean up the resources that you created in the Quickstart in two ways. You can delete the Azure resource group, which includes all the resources in the resource group. If you want to keep the other resources intact, delete only the data factory you created in this tutorial.
Run the following command to delete the entire resource group:
Remove-AzureRmResourceGroup -ResourceGroupName $resourceGroup
Run the following command to delete only the data factory:
Remove-AzureRmDataFactoryV2 -Name "<NameOfYourDataFactory>" -ResourceGroupName "<NameOfResourceGroup>"
The pipeline in this sample copies data from one location to another location in Azure Blob storage. Go through the tutorials to learn about using Data Factory in more scenarios.