
Commit 6f200b3

started V1 -> V2 link for transformation activities

1 parent 738e53b commit 6f200b3
3 files changed: +17 −27 lines

articles/data-factory/tutorial-transform-data-using-spark-powershell.md (+3 −1)

````diff
@@ -261,6 +261,7 @@ You have authored linked service and pipeline definitions in JSON files. Now, le
 New-AzureRmDataFactoryV2Pipeline -dataFactory $df -Name $pipelineName -File "MySparkOnDemandPipeline.json"
 ```
 ## Start and monitor pipeline run
+
 1. Start a pipeline run. It also captures the pipeline run ID for future monitoring.
 
     ```powershell
@@ -332,4 +333,5 @@ You have authored linked service and pipeline definitions in JSON files. Now, le
     Status : Succeeded
     Error : {errorCode, message, failureType, target}
     ```
-4. Confirm that a folder named `outputfiles` is created in the `spark` folder of adftutorial container with the output from the spark program.
+
+4. Confirm that a folder named `outputfiles` is created in the `spark` folder of adftutorial container with the output from the spark program.
````
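For context, the start-and-monitor pattern this tutorial walks through can be sketched as follows. This is a sketch only, not content from the commit: it assumes the AzureRM.DataFactoryV2 preview cmdlets the tutorial uses, an authenticated Azure session, and the `$df` (data factory object) and `$pipelineName` variables defined in earlier tutorial steps.

```powershell
# Sketch, assuming a prior Login-AzureRmAccount and the tutorial's
# $df and $pipelineName variables are already set.

# Start a pipeline run and capture the run ID for monitoring.
$runId = Invoke-AzureRmDataFactoryV2Pipeline -DataFactory $df -PipelineName $pipelineName

# Poll until the run leaves the InProgress state.
while ($true) {
    $run = Get-AzureRmDataFactoryV2PipelineRun -DataFactory $df -PipelineRunId $runId
    if ($run -and $run.Status -ne "InProgress") { break }
    Start-Sleep -Seconds 30
}

# On success, Status is Succeeded, matching the sample output shown in the diff.
$run | Format-List Status, Error
```

The polling loop mirrors the tutorial's monitoring step; in practice you would also bound the loop with a timeout rather than polling forever.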

articles/data-factory/v1/data-factory-use-custom-activities.md (+5 −11)

````diff
@@ -19,18 +19,12 @@ ms.author: spelluru
 robots: noindex
 ---
 # Use custom activities in an Azure Data Factory pipeline
+> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
+> * [Version 1 - GA](v1/data-factory-use-custom-activities.md)
+> * [Version 2 - Preview](transform-data-using-dotnet-custom-activity.md)
 
-> [!div class="op_single_selector" title1="Transformation Activities"]
-> * [Hive Activity](data-factory-hive-activity.md)
-> * [Pig Activity](data-factory-pig-activity.md)
-> * [MapReduce Activity](data-factory-map-reduce.md)
-> * [Hadoop Streaming Activity](data-factory-hadoop-streaming-activity.md)
-> * [Spark Activity](data-factory-spark.md)
-> * [Machine Learning Batch Execution Activity](data-factory-azure-ml-batch-execution-activity.md)
-> * [Machine Learning Update Resource Activity](data-factory-azure-ml-update-resource-activity.md)
-> * [Stored Procedure Activity](data-factory-stored-proc-activity.md)
-> * [Data Lake Analytics U-SQL Activity](data-factory-usql-activity.md)
-> * [.NET Custom Activity](data-factory-use-custom-activities.md)
+> [!NOTE]
+> This article applies to version 1 of Data Factory, which is generally available (GA). If you are using version 2 of the Data Factory service, which is in preview, see [Custom activities in V2](../transform-data-using-dotnet-custom-activity.md).
 
 There are two types of activities that you can use in an Azure Data Factory pipeline.
 
````

articles/data-factory/v1/data-factory-usql-activity.md (+9 −15)

````diff
@@ -19,24 +19,18 @@ ms.author: spelluru
 robots: noindex
 ---
 # Transform data by running U-SQL scripts on Azure Data Lake Analytics
-> [!div class="op_single_selector" title1="Transformation Activities"]
-> * [Hive Activity](data-factory-hive-activity.md)
-> * [Pig Activity](data-factory-pig-activity.md)
-> * [MapReduce Activity](data-factory-map-reduce.md)
-> * [Hadoop Streaming Activity](data-factory-hadoop-streaming-activity.md)
-> * [Spark Activity](data-factory-spark.md)
-> * [Machine Learning Batch Execution Activity](data-factory-azure-ml-batch-execution-activity.md)
-> * [Machine Learning Update Resource Activity](data-factory-azure-ml-update-resource-activity.md)
-> * [Stored Procedure Activity](data-factory-stored-proc-activity.md)
-> * [Data Lake Analytics U-SQL Activity](data-factory-usql-activity.md)
-> * [.NET Custom Activity](data-factory-use-custom-activities.md)
+> [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
+> * [Version 1 - GA](v1/data-factory-usql-activity.md)
+> * [Version 2 - Preview](transform-data-using-data-lake-analytics.md)
+
+> [!NOTE]
+> This article applies to version 1 of Data Factory, which is generally available (GA). If you are using version 2 of the Data Factory service, which is in preview, see [U-SQL Activity in V2](../transform-data-using-data-lake-analytics.md).
 
 A pipeline in an Azure data factory processes data in linked storage services by using linked compute services. It contains a sequence of activities where each activity performs a specific processing operation. This article describes the **Data Lake Analytics U-SQL Activity** that runs a **U-SQL** script on an **Azure Data Lake Analytics** compute linked service.
 
-> [!NOTE]
-> Create an Azure Data Lake Analytics account before creating a pipeline with a Data Lake Analytics U-SQL Activity. To learn about Azure Data Lake Analytics, see [Get started with Azure Data Lake Analytics](../../data-lake-analytics/data-lake-analytics-get-started-portal.md).
->
-> Review the [Build your first pipeline tutorial](data-factory-build-your-first-pipeline.md) for detailed steps to create a data factory, linked services, datasets, and a pipeline. Use JSON snippets with Data Factory Editor or Visual Studio or Azure PowerShell to create Data Factory entities.
+Create an Azure Data Lake Analytics account before creating a pipeline with a Data Lake Analytics U-SQL Activity. To learn about Azure Data Lake Analytics, see [Get started with Azure Data Lake Analytics](../../data-lake-analytics/data-lake-analytics-get-started-portal.md).
+
+Review the [Build your first pipeline tutorial](data-factory-build-your-first-pipeline.md) for detailed steps to create a data factory, linked services, datasets, and a pipeline. Use JSON snippets with Data Factory Editor or Visual Studio or Azure PowerShell to create Data Factory entities.
 
 ## Supported authentication types
 U-SQL activity supports below authentication types against Data Lake Analytics:
````
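For reference, the V1 Data Lake Analytics U-SQL Activity that this article documents is declared in a pipeline's JSON definition roughly as below. This is an illustrative sketch, not content from the commit: the pipeline and activity names, script path, linked-service names, and schedule values are hypothetical placeholders.

```json
{
  "name": "ComputeEventsByRegionPipeline",
  "properties": {
    "activities": [
      {
        "type": "DataLakeAnalyticsU-SQL",
        "name": "EventsByRegion",
        "linkedServiceName": "AzureDataLakeAnalyticsLinkedService",
        "typeProperties": {
          "scriptPath": "scripts\\SearchLogProcessing.txt",
          "scriptLinkedService": "StorageLinkedService",
          "degreeOfParallelism": 3,
          "priority": 100
        },
        "outputs": [ { "name": "EventsByRegionTable" } ],
        "policy": { "timeout": "06:00:00", "concurrency": 1 },
        "scheduler": { "frequency": "Day", "interval": 1 }
      }
    ],
    "start": "2017-01-01T00:00:00Z",
    "end": "2017-01-02T00:00:00Z"
  }
}
```

In V1 the U-SQL script lives in a storage account referenced by `scriptLinkedService`, and the activity runs on the Data Lake Analytics account named by `linkedServiceName`.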
