Commit: Lab Updates

Suffix info updated
fabragaMS committed Jun 15, 2019
1 parent ef440d0 commit 10ddc77
Showing 4 changed files with 18 additions and 11 deletions.
6 changes: 1 addition & 5 deletions Lab/Lab1/Lab1.md
@@ -14,17 +14,13 @@ Step | Description
![4](./Media/Black4.png) | Load data to an Azure SQL Data Warehouse table using Polybase
![5](./Media/Black5.png) | Visualize data from Azure SQL Data Warehouse using Power BI

- **IMPORTANT**: Some of the Azure services provisioned require globally unique name and a “-suffix” has been added to their names to ensure this uniqueness. Please take note of the suffix generated as you will need it for the following resources:
+ **IMPORTANT**: Some of the Azure services provisioned require globally unique name and a “-suffix” has been added to their names to ensure this uniqueness. Please take note of the suffix generated as you will need it for the following resources in this lab:

Name |Type
-----------------------------|--------------------
mdwcosmosdb-*suffix* |Cosmos DB account
MDWDataFactory-*suffix* |Data Factory (V2)
mdwdatalake*suffix* |Storage Account
MDWEventHubs-*suffix* |Event Hubs Namespace
MDWKeyVault-*suffix* |Key vault
mdwsqlvirtualserver-*suffix* |SQL server
MDWStreamAnalytics-*suffix* |Stream Analytics job

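The naming convention in the table above can be sketched as a small helper. This is a hypothetical Python snippet, not part of the lab; the suffix value (`h4xq2` below) is a made-up example, since the real one is generated by your own deployment:

```python
def resource_names(suffix: str) -> dict:
    """Return the lab's resource names for a given deployment suffix.

    Note the two conventions in the table: most names take a hyphen
    before the suffix, but storage account names cannot contain hyphens
    or uppercase letters, so there the suffix is appended directly.
    """
    return {
        "Cosmos DB account": f"mdwcosmosdb-{suffix}",
        "Data Factory (V2)": f"MDWDataFactory-{suffix}",
        "Storage Account": f"mdwdatalake{suffix}",  # no hyphen allowed
        "Event Hubs Namespace": f"MDWEventHubs-{suffix}",
        "Key vault": f"MDWKeyVault-{suffix}",
        "SQL server": f"mdwsqlvirtualserver-{suffix}",
        "Stream Analytics job": f"MDWStreamAnalytics-{suffix}",
    }

# "h4xq2" is a hypothetical suffix for illustration only.
print(resource_names("h4xq2")["Storage Account"])  # mdwdatalakeh4xq2
```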
## Connect to MDWDesktop
In this section you are going to establish a Remote Desktop connection to the MDWDesktop virtual machine.
7 changes: 1 addition & 6 deletions Lab/Lab2/Lab2.md
@@ -13,20 +13,15 @@ Step | Description
![](./Media/Green3.png) | Use Polybase to load data into staging tables in your Azure SQL Data Warehouse. Call a Stored Procedure to perform data aggregations and save results in the final table.
![](./Media/Green4.png) | Visualize data from your Azure SQL Data Warehouse using Power BI

- **IMPORTANT**: Some of the Azure services provisioned require globally unique name and a “-suffix” has been added to their names to ensure this uniqueness. Please take note of the suffix generated as you will need it for the following resources:
+ **IMPORTANT**: Some of the Azure services provisioned require globally unique name and a “-suffix” has been added to their names to ensure this uniqueness. Please take note of the suffix generated as you will need it for the following resources in this lab:

Name |Type
-----------------------------|--------------------
mdwcosmosdb-*suffix* |Cosmos DB account
MDWDataFactory-*suffix* |Data Factory (V2)
mdwdatalake*suffix* |Storage Account
MDWEventHubs-*suffix* |Event Hubs Namespace
MDWKeyVault-*suffix* |Key vault
mdwsqlvirtualserver-*suffix* |SQL server
MDWStreamAnalytics-*suffix* |Stream Analytics job



## Create Azure SQL Data Warehouse database objects
In this section you will connect to your Azure SQL Data Warehouse to create the database objects used to host and process data.

7 changes: 7 additions & 0 deletions Lab/Lab3/Lab3.md
@@ -10,6 +10,13 @@ Step | Description
-------- | -----
![](./Media/Red1.png) |Build an Azure Databricks notebook to explore the data files you saved in your data lake in the previous exercise. You will use Python and SQL commands to open a connection to your data lake and query data from data files.

**IMPORTANT**: Some of the Azure services provisioned require globally unique names, so a “-suffix” has been added to their names to ensure this uniqueness. Please take note of the generated suffix, as you will need it for the following resources:

Name |Type
-----------------------------|--------------------
mdwdatalake*suffix* |Storage Account
MDWDatabricks-*suffix* |Databricks Workspace

## Create Azure Databricks Cluster
In this section you are going to create an Azure Databricks cluster that will be used to execute notebooks.

9 changes: 9 additions & 0 deletions Lab/Lab4/Lab4.md
@@ -15,6 +15,15 @@ Step | Description
![](./Media/Blue5.png) | Copy metadata JSON documents into your Cosmos DB database
![](./Media/Blue6.png) | Visualize images and associated metadata using Power BI

**IMPORTANT**: Some of the Azure services provisioned require globally unique names, so a “-suffix” has been added to their names to ensure this uniqueness. Please take note of the generated suffix, as you will need it for the following resources:

Name |Type
-----------------------------|--------------------
mdwcosmosdb-*suffix* |Cosmos DB account
MDWDataFactory-*suffix* |Data Factory (V2)
mdwdatalake*suffix* |Storage Account
MDWDatabricks-*suffix* |Databricks Workspace

## Create NYCImages and NYCImageMetadata Containers in Azure Blob Storage
In this section you will create containers in your MDWDataLake storage account to serve as repositories for the NYC image files and their metadata. You will copy 30 files from the MDWResources Storage Account into your NYCImages container.
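The copy step described above could be scripted rather than done by hand. The following is a hypothetical sketch assuming the `azure-storage-blob` (v12) Python package; the connection strings and container names are placeholders, not values from the lab, and cross-account copies generally require a SAS token on the source URL. The selection logic is split into a pure helper so it can be exercised without Azure credentials:

```python
def pick_image_blobs(blob_names, limit=30):
    """Select up to `limit` image blobs, mirroring the lab's
    '30 files' step. Pure function: no Azure calls, easily testable."""
    images = [n for n in sorted(blob_names)
              if n.lower().endswith((".jpg", ".jpeg", ".png"))]
    return images[:limit]

def copy_images(src_conn_str, dst_conn_str, src_container, dst_container):
    # Requires real Azure credentials; shown only to illustrate the calls.
    from azure.storage.blob import BlobServiceClient
    src = BlobServiceClient.from_connection_string(src_conn_str)
    dst = BlobServiceClient.from_connection_string(dst_conn_str)
    src_client = src.get_container_client(src_container)
    names = [b.name for b in src_client.list_blobs()]
    for name in pick_image_blobs(names):
        # For a cross-account copy, this URL would need a SAS token appended.
        source_url = src_client.get_blob_client(name).url
        dst.get_blob_client(dst_container, name).start_copy_from_url(source_url)
```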

