This template creates a version 2 (V2) data factory with a pipeline that copies data from a folder in Azure Blob storage to a table in an Azure Database for PostgreSQL.
- Azure subscription. If you don't have an Azure subscription, create a free Azure account before you begin.
- Azure storage account. You use Blob storage as a source data store. If you don't have a storage account, see Create an Azure storage account for steps to create one.
- Azure Database for PostgreSQL. You use the database as a sink data store. If you don't have a PostgreSQL database, see Create a PostgreSQL database for steps to create one.
When you deploy this Azure Resource Manager (ARM) template, a V2 data factory is created with the following entities:
- Azure Storage linked service
- Azure Database for PostgreSQL linked service
- Azure Blob input dataset
- Azure Database for PostgreSQL output dataset
- Pipeline with a copy activity
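Each of these entities maps to a resource in the template's `resources` array. The trimmed sketch below shows the general shape of the two linked-service definitions; the resource names, parameter names, and property values here are illustrative assumptions, not the template's actual contents:

```json
{
  "resources": [
    {
      "type": "Microsoft.DataFactory/factories/linkedservices",
      "name": "[concat(parameters('dataFactoryName'), '/AzureStorageLinkedService')]",
      "properties": {
        "type": "AzureStorage",
        "typeProperties": {
          "connectionString": {
            "type": "SecureString",
            "value": "[parameters('storageConnectionString')]"
          }
        }
      }
    },
    {
      "type": "Microsoft.DataFactory/factories/linkedservices",
      "name": "[concat(parameters('dataFactoryName'), '/AzurePostgreSqlLinkedService')]",
      "properties": {
        "type": "AzurePostgreSql",
        "typeProperties": {
          "connectionString": {
            "type": "SecureString",
            "value": "[parameters('postgreSqlConnectionString')]"
          }
        }
      }
    }
  ]
}
```

The datasets and pipeline follow the same pattern, using the `datasets` and `pipelines` child resource types and referencing the linked services by name.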
After the deployment completes:

- Click the Deployment succeeded message.
- Click Go to resource group.
- Search for the data factory that was created.
The following sections provide steps for running and monitoring the pipeline. For more information, see Quickstart: Create a data factory by using Azure PowerShell.
After you deploy the template, follow these steps to run and monitor the pipeline:
- Download runmonitor.ps1 to a folder on your machine.

- Launch Azure PowerShell.

- Run the following command to log in to Azure:

  ```powershell
  Login-AzureRmAccount
  ```

- Switch to the folder where you copied the script file.

- Run the following command to run the script, specifying the names of your Azure resource group and data factory:

  ```powershell
  .\runmonitor.ps1 -resourceGroupName "<name of your resource group>" -DataFactoryName "<name of your data factory>"
  ```
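The contents of runmonitor.ps1 are not reproduced here, but a script of this kind typically triggers the pipeline and polls its run status with the AzureRM Data Factory cmdlets. A minimal sketch, assuming the AzureRM.DataFactoryV2 module is installed and using a hypothetical pipeline name (the actual name comes from the template):

```powershell
# Sketch only: starts a pipeline run and polls until it finishes.
# Requires: AzureRM.DataFactoryV2 module and an active Login-AzureRmAccount session.
param(
    [Parameter(Mandatory = $true)] [string] $resourceGroupName,
    [Parameter(Mandatory = $true)] [string] $DataFactoryName
)

# Start the pipeline. "SampleCopyPipeline" is an assumed name; use the
# pipeline name defined in your template.
$runId = Invoke-AzureRmDataFactoryV2Pipeline -ResourceGroupName $resourceGroupName `
    -DataFactoryName $DataFactoryName -PipelineName "SampleCopyPipeline"

# Poll every 30 seconds until the run leaves the InProgress state.
while ($true) {
    $run = Get-AzureRmDataFactoryV2PipelineRun -ResourceGroupName $resourceGroupName `
        -DataFactoryName $DataFactoryName -PipelineRunId $runId
    if ($run -and $run.Status -ne "InProgress") { break }
    Write-Host "Pipeline run status: InProgress"
    Start-Sleep -Seconds 30
}
Write-Host "Pipeline run finished with status: $($run.Status)"
```

When the run succeeds, you can verify the copied rows by querying the target table in your Azure Database for PostgreSQL.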