ADF Screenshot Updates
Multiple screenshot updates to Lab1, Lab2, and Lab4 to reflect recent changes in the ADF UI.
fabragaMS committed Jun 11, 2020
1 parent 54cfe4e commit 92d9e92
Showing 13 changed files with 17 additions and 18 deletions.
4 changes: 2 additions & 2 deletions Lab/Lab1/Lab1.md
@@ -177,7 +177,7 @@ In this section you will build an Azure Data Factory pipeline to copy a table fr
![](./Media/Lab1-Image55.png)


-3. In the **Azure Data Factory** portal, click the **Author *(pencil icon)*** option on the left-hand side panel. Under the **Connections** tab, click **Linked Services** and then click **+ New** to create a new linked service connection.
+3. In the **Azure Data Factory** portal, click the **Manage *(toolcase icon)*** option on the left-hand side panel. Under the **Linked services** menu item, click **+ New** to create a new linked service connection.

![](./Media/Lab1-Image29.png)
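
For reference, the same linked service can be created programmatically. The sketch below is a minimal example using the `azure-mgmt-datafactory` Python SDK; the subscription, resource group, factory name, and connection string are placeholders, not values taken from the lab.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLinkedService,
    LinkedServiceResource,
    SecureString,
)

# Placeholder values -- substitute your own subscription, resource group and factory.
client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# The connection string is wrapped in a SecureString so the service does not echo it back.
linked_service = LinkedServiceResource(
    properties=AzureBlobStorageLinkedService(
        connection_string=SecureString(
            value="DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
        )
    )
)
client.linked_services.create_or_update(
    "<resource-group>", "<data-factory-name>", "MDWResources", linked_service
)
```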

@@ -336,7 +336,7 @@ In this section you will build an Azure Data Factory pipeline to copy a table fr
-------------|
**Execute these steps on your host computer**|

-1. Open the **Azure Data Factory** portal and click the **Author *(pencil icon)*** option on the left-hand side panel. Under **Factory Resources** tab, click the ellipsis **(…)** next to **Pipelines** and then click **Add Pipeline** to create a new pipeline.
+1. Open the **Azure Data Factory** portal and click the **Author *(pencil icon)*** option on the left-hand side panel. Under **Factory Resources** tab, click the ellipsis **(…)** next to **Pipelines** and then click **New Pipeline** to create a new pipeline.
2. On the **New Pipeline** tab, enter the following details:
<br>- **General > Name**: Lab1 - Copy Collision Data
3. Leave remaining fields with default values.
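
If you prefer to script this step, a minimal sketch (reusing the `client` object from the linked-service example above) creates the empty pipeline that the following steps populate:

```python
from azure.mgmt.datafactory.models import PipelineResource

# An empty pipeline shell; the copy activity is added in the steps that follow.
pipeline = PipelineResource(activities=[])
client.pipelines.create_or_update(
    "<resource-group>", "<data-factory-name>", "Lab1 - Copy Collision Data", pipeline
)
```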
Binary file modified Lab/Lab1/Media/Lab1-Image29.png
Binary file modified Lab/Lab1/Media/Lab1-Image36.png
Binary file modified Lab/Lab1/Media/Lab1-Image37.png
Binary file modified Lab/Lab1/Media/Lab1-Image43.png
23 changes: 11 additions & 12 deletions Lab/Lab2/Lab2.md
@@ -106,7 +106,7 @@ In this section you will create a linked service connection to a shared storage
-------------|
**Execute these steps on your host computer**|

-1. Open the **Azure Data Factory** portal and click the **Author option *(pencil icon)*** on the left-hand side panel. Under **Connections** tab, click **Linked Services** and then click **+ New** to create a new linked service connection.
+1. Open the **Azure Data Factory** portal and click the **Manage option *(toolcase icon)*** on the left-hand side panel. Under **Linked services** menu item, click **+ New** to create a new linked service connection.

![](./Media/Lab2-Image09.png)

@@ -129,14 +129,14 @@ In this section you will create a linked service connection to a shared storage
## Create Source and Destination Data Sets
In this section you are going to create 6 datasets that will be used by your data pipeline:

-Dataset |Role | Description
---------|---------------|----------------
-**MDWResources_NYCTaxiData_Binary**| Source |References the MDWResources shared storage account container that contains the source NYC Taxi data files.
-**SynapseDataLake_NYCTaxiData_Binary**| Destination |References your synapsedatalake-*suffix* storage account. It acts as the destination for the NYC Taxi data files copied from MDWResources_NYCTaxiData_Binary.
-**NYCDataSets_NYCTaxiLocationLookup**| Source |References the [NYC].[TaxiLocationLookup] table in the NYCDataSets database. This table contains records with all taxi location codes and names.
-**SynapseDW_NYCTaxiLocationLookup**| Destination |References the destination table [NYC].[TaxiLocationLookup] in the Azure Synapse Analytics data warehouse SynapseDW and acts as the destination for lookup data copied from NYCDataSets_NYCTaxiLocationLookup.
-**SynapseDataLake_NYCTaxiData_CSV**| Source |References the NYCTaxiData-Raw container in your SynapseDataLake-*suffix* storage account. It functions as a data source for the Mapping Data Flow.
-**SynapseDW_NYCTaxiDataSummary**| Destination |References the table [NYC].[TaxiDataSummary] in Azure Synapse Analytics and acts as the destination for the summary data generated by your Mapping Data Flow.
+Dataset |Role |Linked Service| Description|
+--------|---------------|----------------|---------------
+**MDWResources_NYCTaxiData_Binary**| Source |MDWResources|References the MDWResources shared storage account container that contains the source NYC Taxi data files.
+**SynapseDataLake_NYCTaxiData_Binary**| Destination |synapsedatalake|References your synapsedatalake-*suffix* storage account. It acts as the destination for the NYC Taxi data files copied from MDWResources_NYCTaxiData_Binary.
+**NYCDataSets_NYCTaxiLocationLookup**| Source |OperationalSQL_NYCDataSets|References the [NYC].[TaxiLocationLookup] table in the NYCDataSets database. This table contains records with all taxi location codes and names.
+**SynapseDW_NYCTaxiLocationLookup**| Destination |SynapseSQL_SynapseDW|References the destination table [NYC].[TaxiLocationLookup] in the Azure Synapse Analytics data warehouse SynapseDW and acts as the destination for lookup data copied from NYCDataSets_NYCTaxiLocationLookup.
+**SynapseDataLake_NYCTaxiData_CSV**| Source |synapsedatalake|References the NYCTaxiData-Raw container in your SynapseDataLake-*suffix* storage account. It functions as a data source for the Mapping Data Flow.
+**SynapseDW_NYCTaxiDataSummary**| Destination |SynapseSQL_SynapseDW|References the table [NYC].[TaxiDataSummary] in Azure Synapse Analytics and acts as the destination for the summary data generated by your Mapping Data Flow.
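
Each dataset binds one of the linked services above to a concrete location. As an illustration only, the sketch below defines the MDWResources_NYCTaxiData_Binary dataset with the Python SDK; the container name and the `client` object are assumptions carried over from the earlier sketch.

```python
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLocation,
    BinaryDataset,
    DatasetResource,
    LinkedServiceReference,
)

# Binary dataset pointing at the shared container (container name is a placeholder).
dataset = DatasetResource(
    properties=BinaryDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="MDWResources"
        ),
        location=AzureBlobStorageLocation(container="<nyctaxidata-container>"),
    )
)
client.datasets.create_or_update(
    "<resource-group>", "<data-factory-name>",
    "MDWResources_NYCTaxiData_Binary", dataset,
)
```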

**IMPORTANT**|
-------------|
@@ -567,11 +567,11 @@ In this section you are going to create an integration runtime for Mapping Data
-------------|
**Execute these steps on your host computer**|

-1. On the Azure Data Factory portal, click the **Author option *(pencil icon)*** on the left-hand side panel. Under the **Connections** tab, click the **Integration runtimes** tab and then click **+ New** to create a new integration runtime.
+1. On the Azure Data Factory portal, click the **Manage *(toolcase icon)*** option on the left-hand side panel. Under the **Integration runtimes** tab, click **+ New** to create a new integration runtime.

![](./Media/Lab2-Image70.png)

-2. On the **Integration runtime setup** blade, select **Perform data movement and dispatch to external computes** and click **Continue**.
+2. On the **Integration runtime setup** blade, select **Azure, Self-Hosted** and click **Continue**.

![](./Media/Lab2-Image71.png)
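
For reference, an equivalent Azure integration runtime with Mapping Data Flow compute can be declared through the Python SDK (again reusing the `client` object from the earlier sketch). The runtime name, core count, and time-to-live below are illustrative, not values mandated by the lab:

```python
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeComputeProperties,
    IntegrationRuntimeDataFlowProperties,
    IntegrationRuntimeResource,
    ManagedIntegrationRuntime,
)

# Azure (managed) integration runtime sized for Mapping Data Flow runs.
runtime = IntegrationRuntimeResource(
    properties=ManagedIntegrationRuntime(
        compute_properties=IntegrationRuntimeComputeProperties(
            location="AutoResolve",  # let ADF pick the region closest to the data
            data_flow_properties=IntegrationRuntimeDataFlowProperties(
                compute_type="General", core_count=8, time_to_live=10
            ),
        )
    )
)
client.integration_runtimes.create_or_update(
    "<resource-group>", "<data-factory-name>", "MappingDataFlowsIR", runtime
)
```

A non-zero time-to-live keeps the Spark cluster warm between debug runs, which shortens iteration time at the cost of extra compute charges.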

@@ -593,7 +593,6 @@ In this section you are going to create an integration runtime for Mapping Data
## Create a Mapping Data Flow
In this section you are going to create a Mapping Data Flow that will transform the Taxi detailed records into an aggregated daily summary. The Mapping Data Flow will read all records from the files stored in your SynapseDataLake account and apply a sequence of transformations before the aggregated summary is saved into the NYC.TaxiDataSummary table in your Azure Synapse Analytics data warehouse.
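
Mapping Data Flows execute on Spark, so the transformation is conceptually similar to the PySpark sketch below. The file path and column names are assumptions about the NYC Taxi schema rather than the exact fields used by the lab's data flow:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("NYCTaxiDataSummary").getOrCreate()

# Read the raw CSV files from the data lake (path and columns are placeholders).
trips = spark.read.option("header", True).csv(
    "abfss://nyctaxidata-raw@<synapsedatalake>.dfs.core.windows.net/"
)

# Aggregate detailed trip records into a daily summary; in the lab the
# equivalent sink writes to NYC.TaxiDataSummary in Azure Synapse Analytics.
summary = (
    trips.withColumn("PickUpDate", F.to_date("lpep_pickup_datetime"))
    .groupBy("PickUpDate", "PaymentType")
    .agg(
        F.count("*").alias("TotalTripCount"),
        F.sum("passenger_count").alias("TotalPassengerCount"),
        F.sum("total_amount").alias("TotalAmount"),
    )
)
```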


**IMPORTANT**|
-------------|
**Execute these steps on your host computer**|
Binary file modified Lab/Lab2/Media/Lab2-Image09.png
Binary file modified Lab/Lab2/Media/Lab2-Image12.png
Binary file modified Lab/Lab2/Media/Lab2-Image70.png
Binary file modified Lab/Lab2/Media/Lab2-Image71.png
Binary file modified Lab/Lab2/Media/Lab2-Image72.png
8 changes: 4 additions & 4 deletions Lab/Lab4/Lab4.md
@@ -166,13 +166,13 @@ In this section you will create a Databricks linked service in Azure Data Factor

![](./Media/Lab4-Image27.png)

-7. Open the **Azure Data Factory portal** and click the **Author *(pencil icon)*** option on the left-hand side panel. Under **Connections** tab, click **Linked Services** and then click **+ New** to create a new linked service connection.
+7. Open the **Azure Data Factory portal** and click the **Manage *(toolcase icon)*** option on the left-hand side panel. Under **Linked services** menu item, click **+ New** to create a new linked service connection.

![](./Media/Lab4-Image28.png)

8. On the **New Linked Service** blade, click the **Compute** tab.

-9. Type “Azure Databricks” in the search box to find the **Azure Databricks** linked service.
+9. Type “Azure Databricks” in the search box to find the **Azure Databricks** linked service.

10. Click **Continue**.
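
For reference, a Databricks linked service can also be sketched with the Python SDK (reusing the `client` object from the earlier sketches); the workspace URL, access token, cluster id, and linked service name below are placeholders:

```python
from azure.mgmt.datafactory.models import (
    AzureDatabricksLinkedService,
    LinkedServiceResource,
    SecureString,
)

# Databricks linked service that reuses an existing interactive cluster.
databricks = LinkedServiceResource(
    properties=AzureDatabricksLinkedService(
        domain="https://<region>.azuredatabricks.net",
        access_token=SecureString(value="<databricks-access-token>"),
        existing_cluster_id="<cluster-id>",
    )
)
client.linked_services.create_or_update(
    "<resource-group>", "<data-factory-name>", "<databricks-linked-service>", databricks
)
```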

@@ -205,13 +205,13 @@ In this section you will create a CosmosDB linked service in Azure Data Factory.
-------------|
**Execute these steps on your host computer**|

-1. Open the Azure Data Factory portal and click the **Author *(pencil icon)*** option on the left-hand side panel. Under **Connections** tab, click **Linked Services** and then click **+ New** to create a new linked service connection.
+1. Open the Azure Data Factory portal and click the **Manage *(toolcase icon)*** option on the left-hand side panel. Under **Linked services** menu item, click **+ New** to create a new linked service connection.

![](./Media/Lab4-Image28.png)

2. On the **New Linked Service** blade, click the **Data Store** tab.

-3. Type “Cosmos DB” in the search box to find the **Azure Cosmos DB (SQL API)** linked service.
+3. Type “Cosmos DB” in the search box to find the **Azure Cosmos DB (SQL API)** linked service.

4. Click **Continue**.
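
The Cosmos DB (SQL API) linked service follows the same pattern; in the sketch below the connection string and linked service name are placeholders:

```python
from azure.mgmt.datafactory.models import (
    CosmosDbLinkedService,
    LinkedServiceResource,
    SecureString,
)

# Cosmos DB (SQL API) linked service; the connection string is a placeholder.
cosmos = LinkedServiceResource(
    properties=CosmosDbLinkedService(
        connection_string=SecureString(
            value="AccountEndpoint=https://<account>.documents.azure.com:443/;"
                  "AccountKey=<key>;Database=<database>"
        )
    )
)
client.linked_services.create_or_update(
    "<resource-group>", "<data-factory-name>", "<cosmosdb-linked-service>", cosmos
)
```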

Binary file modified Lab/Lab4/Media/Lab4-Image28.png
