forked from solliancenet/tech-immersion-data-ai
<h1 id="data-ai-tech-immersion-workshop-product-review-guide-and-lab-instructions">Data & AI Tech Immersion Workshop – Product Review Guide and Lab Instructions</h1>
<h2 id="day-1-experience-2---leveraging-cosmos-db-for-near-real-time-analytics">Day 1, Experience 2 - Leveraging Cosmos DB for near real-time analytics</h2>
<ul>
<li><a href="#data--ai-tech-immersion-workshop-%E2%80%93-product-review-guide-and-lab-instructions">Data & AI Tech Immersion Workshop – Product Review Guide and Lab Instructions</a>
<ul>
<li><a href="#day-1-experience-2---leveraging-cosmos-db-for-near-real-time-analytics">Day 1, Experience 2 - Leveraging Cosmos DB for near real-time analytics</a></li>
<li><a href="#technology-overview">Technology overview</a>
<ul>
<li><a href="#azure-cosmos-db">Azure Cosmos DB</a></li>
<li><a href="#azure-functions">Azure Functions</a></li>
<li><a href="#azure-stream-analytics">Azure Stream Analytics</a></li>
<li><a href="#power-bi">Power BI</a></li>
<li><a href="#serverless-computing-using-azure-cosmos-db-and-azure-functions">Serverless computing using Azure Cosmos DB and Azure Functions</a></li>
</ul></li>
<li><a href="#scenario-overview">Scenario overview</a></li>
<li><a href="#experience-requirements">Experience requirements</a></li>
<li><a href="#task-1-configure-cosmos-db">Task 1: Configure Cosmos DB</a></li>
<li><a href="#task-2-configure-event-hubs">Task 2: Configure Event Hubs</a></li>
<li><a href="#task-3-configure-stream-analytics">Task 3: Configure Stream Analytics</a></li>
<li><a href="#task-4-configure-azure-function-app">Task 4: Configure Azure Function App</a></li>
<li><a href="#task-5-publish-function-app-and-run-data-generator">Task 5: Publish Function App and run data generator</a></li>
<li><a href="#task-6-view-published-function">Task 6: View published function</a></li>
<li><a href="#task-7-create-power-bi-dashboard">Task 7: Create Power BI dashboard</a></li>
<li><a href="#wrap-up">Wrap-up</a></li>
<li><a href="#additional-resources-and-more-information">Additional resources and more information</a></li>
</ul></li>
</ul>
<h2 id="technology-overview">Technology overview</h2>
<h3 id="azure-cosmos-db">Azure Cosmos DB</h3>
<p>Develop high-concurrency, low-latency applications with Azure Cosmos DB, a fully managed database service that supports NoSQL APIs and can scale out <a href="https://docs.microsoft.com/en-us/azure/cosmos-db/how-to-multi-master">multi-master</a> workloads anywhere in the world. Ensure blazing fast performance with <a href="https://azure.microsoft.com/en-us/support/legal/sla/cosmos-db/v1_2/">industry-leading service level agreements (SLAs)</a> for single-digit-millisecond reads and writes, data consistency and throughput, and 99.999% high availability. Transparent <a href="https://docs.microsoft.com/en-us/azure/cosmos-db/partitioning-overview">horizontal partitioning</a> provides elastic scaling, matching capacity with demand to control costs and ensure your applications maintain high performance during peak traffic.</p>
<p>Azure Cosmos DB offers built-in, cloud-native capabilities to simplify app development and boost developer productivity, including five well-defined consistency models, <a href="https://docs.microsoft.com/en-us/azure/cosmos-db/index-policy">auto-indexing</a>, and multiple data models. Easily migrate existing NoSQL data with open-source APIs for <a href="https://docs.microsoft.com/en-us/azure/cosmos-db/mongodb-introduction">MongoDB</a>, <a href="https://docs.microsoft.com/en-us/azure/cosmos-db/cassandra-introduction">Cassandra</a>, Gremlin (Graph), and others. Developers can work with tools to build microservices and the languages of their choice, while enjoying seamless integration with Azure services for IoT, advanced analytics, AI and machine learning, and business intelligence.</p>
<p>Azure Cosmos DB enables you to innovate with IoT data to build enhanced user experiences and turn insights into action:</p>
<ul>
<li>Ingest and query diverse IoT data easily using Azure Cosmos DB’s global presence to capture data from anywhere.</li>
<li>Scale elastically to accommodate real-time fluctuations in IoT data.</li>
<li>Seamlessly integrate with tools like Azure Event Hubs, Azure IoT Hub, and Azure Functions to ingest and stream data.</li>
</ul>
<p>Perform real-time analytics on data of any size or type from anywhere, using a Lambda architecture and easy integration with Azure Databricks:</p>
<ul>
<li>Source and serve data quickly through integration with other Azure services for real-time insights.</li>
<li>Run in-depth queries over diverse data sets to understand trends and make better decisions.</li>
<li>Apply analytics, machine learning, and cognitive capabilities to your NoSQL data.</li>
</ul>
<p>By using Azure Cosmos DB, you no longer have to make extreme <a href="https://docs.microsoft.com/en-us/azure/cosmos-db/consistency-levels-tradeoffs">tradeoffs</a> between consistency, availability, latency, and programmability. You can choose from five well-defined consistency levels (strong, bounded staleness, consistent prefix, session, and eventual) to better balance these properties for your users.</p>
<p>Focus your time and attention on developing great apps while Azure handles management and optimization of infrastructure and databases. Deploy databases in a fraction of the time on Microsoft’s platform as a service and leverage built-in configuration options to get up and running fast. You can rest assured your apps are running on a fully managed database service built on world-class infrastructure with enterprise-grade security and compliance.</p>
<h3 id="azure-functions">Azure Functions</h3>
<p><a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-overview">Azure Functions</a> enables you to easily build the apps you need using simple, serverless functions that <a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-scale">scale</a> to meet demand.</p>
<p>Azure Functions allows you to focus on running great apps, instead of the infrastructure on which they run. You don’t need to worry about provisioning and maintaining servers. Azure Functions provides a fully managed compute platform with high reliability and security. With <a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-scale">scale</a> on demand, your code gets the compute resources it needs, when it needs them, freeing you of capacity planning concerns.</p>
<p>Write code only for what truly matters to your business. Utilize an innovative programming model for everything else, such as <a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-scale">communicating with other services</a>, building <a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-http-webhook">HTTP-based APIs</a>, or orchestrating complex workflows. Azure Functions naturally leads you to a microservices-friendly approach for building more scalable and stable applications.</p>
<p>You can create Functions in the <a href="https://docs.microsoft.com/en-us/azure/azure-functions/supported-languages">programming language of your choice</a>. Write code in an easy-to-use web-based interface or build and debug on your local machine with your favorite development tool. You can take advantage of built-in continuous deployment and use integrated monitoring tools to <a href="https://docs.microsoft.com/en-us/azure/app-service/overview-diagnostics">troubleshoot issues</a>.</p>
<h3 id="azure-stream-analytics">Azure Stream Analytics</h3>
<p>As more and more data is generated from a variety of connected devices and sensors, transforming this data into actionable insights and predictions in near real-time is now an operational necessity. <a href="https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-introduction">Azure Stream Analytics</a> seamlessly integrates with your real-time application architecture to enable powerful, real-time analytics on your data no matter what the volume.</p>
<p>Azure Stream Analytics enables you to develop massively parallel Complex Event Processing (CEP) pipelines with simplicity. It allows you to author powerful, real-time analytics solutions using a simple, declarative <a href="https://docs.microsoft.com/en-us/stream-analytics-query/stream-analytics-query-language-reference">SQL-like language</a> with embedded support for temporal logic. An extensive array of <a href="https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-outputs">out-of-the-box connectors</a>, advanced debugging, and job monitoring capabilities helps keep costs down by significantly lowering the developer skills required. Additionally, Azure Stream Analytics is highly extensible through support for custom code with <a href="https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-javascript-user-defined-functions">JavaScript user-defined functions</a>, further extending the streaming logic written in SQL.</p>
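<p>To make the windowed temporal logic concrete, the Python sketch below reproduces what a <code>TUMBLINGWINDOW</code> aggregate computes: events grouped into fixed, non-overlapping time windows, each reduced to an average. This is an illustration only, not the Stream Analytics engine, and the event shape is hypothetical:</p>

```python
from collections import defaultdict

def tumbling_window_avg(events, window_seconds):
    """Group (timestamp, value) events into fixed, non-overlapping
    windows and average each window -- the same grouping an ASA
    TUMBLINGWINDOW(second, N) aggregate performs."""
    windows = defaultdict(list)
    for ts, value in events:
        # Each event belongs to exactly one window, keyed by its start time.
        windows[ts // window_seconds * window_seconds].append(value)
    return {start: sum(v) / len(v) for start, v in sorted(windows.items())}

events = [(0, 10.0), (5, 20.0), (12, 30.0), (17, 50.0)]
print(tumbling_window_avg(events, 10))  # {0: 15.0, 10: 40.0}
```

<p>The equivalent Stream Analytics query would express the same grouping declaratively with <code>GROUP BY TUMBLINGWINDOW(second, 10)</code>.</p>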
<p>Getting started in seconds is easy with Azure Stream Analytics as there is no infrastructure to worry about, and no servers, virtual machines, or clusters to manage. You can instantly <a href="https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-streaming-unit-consumption">scale-out the processing power</a> from one to hundreds of streaming units for any job. You only pay for the processing used per job.</p>
<p><a href="https://docs.microsoft.com/en-us/stream-analytics-query/event-delivery-guarantees-azure-stream-analytics">Guaranteed event delivery</a> and an enterprise-grade SLA providing three nines of availability make Azure Stream Analytics suitable for mission-critical workloads. Automated checkpoints enable fault-tolerant operation with fast restarts and no data loss.</p>
<p>Azure Stream Analytics allows you to quickly build real-time dashboards with Power BI for a live command and control view. <a href="https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-power-bi-dashboard">Real-time dashboards</a> help transform live data into actionable and insightful visuals, and help you focus on what matters to you the most.</p>
<h3 id="power-bi">Power BI</h3>
<p><a href="https://docs.microsoft.com/en-us/power-bi/">Power BI</a> is a business analytics service that delivers insights to enable fast, informed decisions. It enables you to transform data into stunning visuals and share them with colleagues on any device. Power BI provides a rich canvas on which to visually <a href="https://docs.microsoft.com/en-us/power-bi/service-basic-concepts">explore and analyze your data</a>. The ability to collaborate on and share customized <a href="https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-power-bi-dashboard">dashboards</a> and interactive reports is part of the experience, enabling you to scale across your organization with built-in governance and security.</p>
<h3 id="serverless-computing-using-azure-cosmos-db-and-azure-functions">Serverless computing using Azure Cosmos DB and Azure Functions</h3>
<figure>
<img src="media/cosmos-db-and-azure-functions.png" title="Azure Cosmos DB and Functions" alt="The diagram shows events being fed into Cosmos DB, and the change feed triggering Azure functions." /><figcaption>The diagram shows events being fed into Cosmos DB, and the change feed triggering Azure functions.</figcaption>
</figure>
<p>Serverless computing is all about the ability to focus on individual pieces of logic that are repeatable and stateless. These pieces require no infrastructure management, and they consume resources only for the seconds, or milliseconds, they run. At the core of the serverless computing movement are functions, which are made available in the Azure ecosystem by Azure Functions. To learn about other serverless execution environments in Azure, see the “serverless in Azure” page.</p>
<p>With the native integration between <a href="https://docs.microsoft.com/en-us/azure/cosmos-db/serverless-computing-database">Azure Cosmos DB and Azure Functions</a>, you can create database triggers, input bindings, and output bindings directly from your Azure Cosmos DB account.</p>
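<p>Conceptually, the change feed behaves like an append-only log that a consumer reads forward from a continuation token, which is what lets a triggered function process each insert or update exactly once per lease. The toy class below is a hypothetical in-memory stand-in to illustrate that model; the real change feed is exposed through the Cosmos DB SDKs and the Functions trigger:</p>

```python
class ChangeFeedSim:
    """Toy in-memory stand-in for the Cosmos DB change feed: an
    append-only log read forward from a continuation token."""

    def __init__(self):
        self.log = []

    def append(self, doc):
        # Writes to the container surface as entries on the feed.
        self.log.append(doc)

    def read(self, continuation=0):
        """Return documents added since `continuation`, plus the next token."""
        changes = self.log[continuation:]
        return changes, continuation + len(changes)

feed = ChangeFeedSim()
feed.append({"vin": "A", "speed": 60})
changes, token = feed.read()        # first poll sees the first document
feed.append({"vin": "B", "speed": 80})
changes, token = feed.read(token)   # next poll sees only the new document
print(changes)  # [{'vin': 'B', 'speed': 80}]
```

<p>An Azure Function bound to the change feed receives batches of changed documents in much the same way, with the continuation state managed for you by a leases collection.</p>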
<p>Azure Functions and Azure Cosmos DB allow you to create, deploy, and easily manage low-latency, event-driven serverless apps based on rich data, serving a globally distributed user base seamlessly.</p>
<figure>
<img src="media/cosmos-db-azure-function.png" title="Cosmos DB and Azure Functions" alt="Use an Azure Cosmos DB trigger to invoke an Azure Function." /><figcaption>Use an Azure Cosmos DB trigger to invoke an Azure Function.</figcaption>
</figure>
<p>For an example of event sourcing architectures based on Azure Cosmos DB in a real-world use case, see <a href="https://blogs.msdn.microsoft.com/azurecat/2018/05/17/azure-cosmos-db-customer-profile-jet-com">https://blogs.msdn.microsoft.com/azurecat/2018/05/17/azure-cosmos-db-customer-profile-jet-com</a>.</p>
<h2 id="scenario-overview">Scenario overview</h2>
<p>Contoso Auto is collecting vehicle telemetry and wants to use Cosmos DB to rapidly ingest and store the data in its raw form, then do some processing in near real-time. In the end, they want to create a dashboard that automatically updates with new data as it flows in after being processed. What they would like to see on the dashboard are various visualizations of detected anomalies, like engines overheating, abnormal oil pressure, and aggressive driving, using components such as a map to show anomalies related to cities, as well as various charts and graphs depicting this information in a clear way.</p>
<p>In this experience, you will use Azure Cosmos DB to ingest streaming vehicle telemetry data as the entry point to a near real-time analytics pipeline built on Cosmos DB, Azure Functions, Event Hubs, Azure Stream Analytics, and Power BI. To start, you will complete configuration and performance-tuning on Cosmos DB to prepare it for data ingest, and use the change feed capability of Cosmos DB to trigger Azure Functions for data processing. The function will enrich the telemetry data with location information, then send it to Event Hubs. Azure Stream Analytics extracts the enriched sensor data from Event Hubs, performs aggregations over windows of time, then sends the aggregated data to Power BI for data visualization and analysis. A vehicle telemetry data generator will be used to send vehicle telemetry data to Cosmos DB.</p>
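<p>The enrichment step in the pipeline above can be sketched as a plain function: take a raw change-feed document, attach location information, and return the enriched record for forwarding to Event Hubs. The field names and lookup table below are hypothetical, for illustration only; the lab’s actual function and schema may differ:</p>

```python
# Hypothetical lookup table -- the lab's actual enrichment source may differ.
CITY_LOOKUP = {"WA": "Seattle", "CA": "Los Angeles"}

def enrich_telemetry(doc):
    """Mimic the Azure Function step: copy the raw Cosmos DB change-feed
    document and attach location info before forwarding to Event Hubs."""
    enriched = dict(doc)  # leave the original document untouched
    enriched["city"] = CITY_LOOKUP.get(doc.get("state"), "Unknown")
    return enriched

record = {"vin": "1HGBH41JXMN109186", "engineTemperature": 212, "state": "WA"}
print(enrich_telemetry(record)["city"])  # Seattle
```

<p>Keeping the function a pure transformation like this makes the change-feed handler stateless, which is exactly the shape serverless functions scale best with.</p>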
<h2 id="experience-requirements">Experience requirements</h2>
<ul>
<li>Azure subscription</li>
<li>Visual Studio 2017 Community (or better)</li>
<li>Power BI account (sign up at <a href="https://powerbi.microsoft.com" class="uri">https://powerbi.microsoft.com</a>)</li>
</ul>
<h2 id="task-1-configure-cosmos-db">Task 1: Configure Cosmos DB</h2>
<p><a href="https://docs.microsoft.com/en-us/azure/cosmos-db/introduction">Azure Cosmos DB</a> provides a multi-model, globally available NoSQL database with high concurrency, low latency, and predictable results. One of its biggest strengths is that it transparently synchronizes data to all regions around the globe, which can quickly and easily be added at any time. This adds value by reducing the amount of development required to read and write the data and removes any need for synchronization. The speed in which Cosmos DB can ingest as well as return data, coupled with its ability to do so at a global scale, makes it ideal for both ingesting real-time data and serving that data to consumers worldwide.</p>
<p>When storing and delivering your data on a global scale, there are some things to consider. Most distributed databases offer two consistency levels: strong and eventual. These live at different ends of a spectrum, where strong consistency often results in slower transactions because it synchronously writes data to each replica set. This guarantees that the reader will always see the most recent committed version of the data. Eventual consistency, on the other hand, asynchronously writes to each replica set with no ordering guarantee for reads. The replicas eventually converge, but the risk is that it can take several reads to retrieve the most up-to-date data.</p>
<p>Azure Cosmos DB was designed with control over the tradeoffs between read consistency, availability, latency, and throughput. This is why Cosmos DB offers five consistency levels: strong, bounded staleness, session, consistent prefix, and eventual. As a general rule of thumb, you can get about 2x read throughput for session, consistent prefix, and eventual consistency models compared to bounded staleness or strong consistency.</p>
<p>The Session consistency level is the default, and is suitable for most operations. It provides strong consistency for the session (application or connection), where all reads are current with writes from that session. Data from other sessions comes in the correct order, but isn’t guaranteed to be current. Session consistency provides a balance of good performance and good availability at half the cost of both strong consistency and bounded staleness. As mentioned before, session also provides about 2x the read throughput of these two stronger consistency levels.</p>
<p>For this scenario, Contoso Auto does not need to ingest and serve their data globally just yet. Right now, they are working on a POC to rapidly ingest vehicle telemetry data, process that data as it arrives, and visualize the processed data through a real-time dashboard. Cosmos DB gives them the flexibility to add regions in the future either programmatically through its APIs, or through the “Replicate data globally” Cosmos DB settings in the portal.</p>
<p>To do this, go to the “Replicate data globally” settings, select the option to add a region, then choose the region you wish to add.</p>
<figure>
<img src="media/cosmos-add-region.png" title="Configure regions" alt="The Add Region button is highlighted." /><figcaption>The Add Region button is highlighted.</figcaption>
</figure>
<p>Once you are finished adding regions, simply select the Save button to apply your changes. You will see the regions highlighted on a map.</p>
<figure>
<img src="media/cosmos-region-map.png" title="Regions" alt="A map is displayed showing the regions that were added." /><figcaption>A map is displayed showing the regions that were added.</figcaption>
</figure>
<p>To ensure high write and read availability, configure your Cosmos account to span at least two regions with multiple write regions. This configuration provides the highest availability, lowest latency, and best scalability for both reads and writes, backed by SLAs. To learn more, see <a href="https://docs.microsoft.com/en-us/azure/cosmos-db/tutorial-global-distribution-sql-api">how to configure your Cosmos account with multiple write regions</a>. To configure multi-master in your applications, see <a href="https://docs.microsoft.com/en-us/azure/cosmos-db/how-to-multi-master">How to configure multi-master</a>.</p>
<p>You may be wondering how you can control the throughput, or the speed at which data can be written to or read from Cosmos DB, at a global level. In Azure Cosmos DB, provisioned throughput is expressed in request units per second (RU/s). RUs measure the cost of both read and write operations against your Cosmos DB container. Because Cosmos DB is designed with transparent horizontal scaling (that is, scale-out) and multi-master replication, you can very quickly and easily increase or decrease the number of RUs to handle thousands to hundreds of millions of requests per second around the globe with a single API call.</p>
<p>Cosmos DB allows you to increase or decrease provisioned RUs in increments of 1,000 at the database level, and in even smaller increments of 100 RU/s at the container level. It is recommended that you configure throughput at the container granularity to guarantee performance for that container at all times, backed by SLAs. Cosmos DB also guarantees 99.999% read and write availability around the world, with reads and writes served in less than 10 milliseconds at the 99th percentile.</p>
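<p>As a sketch of how a client might normalize a requested value to a valid container-level throughput setting, the helper below rounds up to the nearest 100 RU/s increment. The 400 RU/s minimum is an assumption made here for illustration, so check the current service limits before relying on it:</p>

```python
def valid_container_throughput(requested_ru, minimum=400, step=100):
    """Round a requested throughput up to the nearest valid container-level
    value: increments of 100 RU/s, with an assumed 400 RU/s minimum."""
    ru = max(requested_ru, minimum)
    return -(-ru // step) * step  # ceiling division to the next increment

print(valid_container_throughput(15000))  # 15000 (already valid)
print(valid_container_throughput(250))    # 400   (below the minimum)
print(valid_container_throughput(1050))   # 1100  (rounded up to a step)
```

<p>The 15,000 RU/s value used for the telemetry collection in Task 1 is already a valid setting under these rules.</p>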
<p>When you set a number of RUs for a container, Cosmos DB ensures that those RUs are available in all regions associated with your Cosmos DB account. When you scale out the number of regions by adding a new one, Cosmos will automatically provision the same quantity of RUs in the newly added region. You cannot selectively assign different RUs to a specific region. These RUs are provisioned for a container (or database) for all associated regions.</p>
<p>In this task, you will create a new Cosmos DB database and collection, set the throughput units, and obtain the connection details.</p>
<ol type="1">
<li><p>To start, open a new web browser window and navigate to <a href="https://portal.azure.com" class="uri">https://portal.azure.com</a>. Log in with the credentials provided to you for this lab.</p></li>
<li><p>After logging into the Azure portal, select <strong>Resource groups</strong> from the left-hand menu. Then select the resource group named <strong>tech-immersion-YOUR_UNIQUE_IDENTIFIER</strong>. The <code>YOUR_UNIQUE_IDENTIFIER</code> portion of the name is the unique identifier assigned to you for this lab.</p>
<figure>
<img src="media/tech-immersion-rg.png" title="Resource groups" alt="The tech-immersion resource group is selected." /><figcaption>The tech-immersion resource group is selected.</figcaption>
</figure></li>
<li><p>Select the <strong>Azure Cosmos DB account</strong> from the list of resources in your resource group.</p>
<figure>
<img src="media/tech-immersion-rg-cosmos-db.png" title="tech-immersion resource group" alt="The Azure Cosmos DB account is selected in the resource group." /><figcaption>The Azure Cosmos DB account is selected in the resource group.</figcaption>
</figure></li>
<li><p>Within the Cosmos DB account blade, select <strong>Data Explorer</strong> on the left-hand menu.</p>
<figure>
<img src="media/cosmos-db-data-explorer-link.png" title="Data Explorer link" alt="The Data Explorer link located in the left-hand menu is highlighted." /><figcaption>The Data Explorer link located in the left-hand menu is highlighted.</figcaption>
</figure></li>
<li><p>If the ContosoAuto database and telemetry collection already exist, <strong>skip ahead</strong> to step 9.</p>
<figure>
<img src="media/cosmos-db-database-exists.png" title="Data Explorer" alt="Screenshot shows the database and collection already exists." /><figcaption>Screenshot shows the database and collection already exists.</figcaption>
</figure></li>
<li><p>Select <strong>New Collection</strong> in the top toolbar.</p>
<figure>
<img src="media/cosmos-db-new-collection-link.png" title="New Collection link" alt="The New Collection link in the top toolbar is highlighted." /><figcaption>The New Collection link in the top toolbar is highlighted.</figcaption>
</figure></li>
<li><p>In the <strong>Add Collection</strong> blade, configure the following:</p>
<ul>
<li><strong>Database id:</strong> Select <strong>Create new</strong>, then enter “ContosoAuto” for the id.</li>
<li><strong>Provision database throughput:</strong> Unchecked.</li>
<li><strong>Collection id:</strong> Enter “telemetry”.</li>
<li><strong>Partition key:</strong> Enter “/vin”.</li>
<li><strong>Throughput:</strong> Enter 15000.</li>
</ul>
<blockquote>
<p>The <code>/vin</code> partition key was selected because the data will include this value, and it allows us to partition by the vehicle from which the transaction originated. This field also contains a wide range of values, which is preferable for partitions.</p>
</blockquote>
<figure>
<img src="media/cosmos-db-new-collection.png" title="Add Collection" alt="The Add Collection form is filled out with the previously mentioned settings entered into the appropriate fields." /><figcaption>The Add Collection form is filled out with the previously mentioned settings entered into the appropriate fields.</figcaption>
</figure>
<p>On the subject of partitions, choosing an appropriate partition key for Cosmos DB is a critical step for ensuring balanced reads and writes, scaling, and, in this case, in-order change feed processing per partition. While there is no limit, per se, on the number of logical partitions, a single logical partition has an upper limit of 10 GB of storage, and a logical partition cannot be split across physical partitions. For the same reason, if the chosen partition key has poor cardinality, you could end up with skewed storage distribution. For instance, if one logical partition grows faster than the others and hits the 10 GB maximum while the others are nearly empty, the physical partition housing the maxed-out logical partition cannot split, which could cause application downtime. This is why we specified <code>vin</code> as the partition key: it has good cardinality for this data set.</p></li>
<li><p>Select <strong>OK</strong> on the bottom of the form when you are finished entering the values.</p></li>
<li><p>Select <strong>Firewall and virtual networks</strong> from the left-hand menu and confirm that Allow access from <strong>All networks</strong> is selected. If it was not previously set to this, select <strong>Save</strong>. This will allow the vehicle telemetry generator application to send data to your Cosmos DB collection.</p>
<figure>
<img src="media/cosmos-db-firewall.png" title="Firewall and virtual networks" alt="The All networks option is selected within the Firewall and virtual networks blade." /><figcaption>The All networks option is selected within the Firewall and virtual networks blade.</figcaption>
</figure></li>
<li><p>Select <strong>Keys</strong> from the left-hand menu.</p>
<figure>
<img src="media/cosmos-db-keys-link.png" title="Keys link" alt="The Keys link on the left-hand menu is highlighted." /><figcaption>The Keys link on the left-hand menu is highlighted.</figcaption>
</figure></li>
<li><p>Copy the <strong>Primary Connection String</strong> value by selecting the copy button to the right of the field. <strong>SAVE THIS VALUE</strong> in Notepad or similar text editor for later.</p>
<figure>
<img src="media/cosmos-db-keys.png" title="Keys" alt="The Primary Connection String key is copied." /><figcaption>The Primary Connection String key is copied.</figcaption>
</figure></li>
</ol>
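<p>The cardinality point from the partition-key discussion above can be simulated: distributing the same synthetic documents by a high-cardinality key (<code>vin</code>) versus a low-cardinality one (a hypothetical <code>region</code> field, used here purely for illustration) shows how the latter concentrates storage in a few huge logical partitions:</p>

```python
import random
from collections import Counter

random.seed(7)
# Simulate 10,000 telemetry documents: many distinct VINs (good
# cardinality) versus only two regions (poor cardinality).
docs = [{"vin": f"VIN{random.randrange(500):04d}",
         "region": random.choice(["east", "west"])}
        for _ in range(10_000)]

by_vin = Counter(d["vin"] for d in docs)        # one logical partition per VIN
by_region = Counter(d["region"] for d in docs)  # one logical partition per region

print(len(by_vin), max(by_vin.values()))        # many small partitions
print(len(by_region), max(by_region.values()))  # 2 very large partitions
```

<p>With <code>vin</code>, documents spread across hundreds of small logical partitions; with <code>region</code>, half the data piles into each of two partitions, the pattern that eventually collides with the 10 GB logical partition limit.</p>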
<h2 id="task-2-configure-event-hubs">Task 2: Configure Event Hubs</h2>
<p>Azure Event Hubs is a Big Data streaming platform and event ingestion service, capable of receiving and processing millions of events per second. We are using it to temporarily store vehicle telemetry data that is processed and ready to be sent to the real-time dashboard. As data flows into Event Hubs, Azure Stream Analytics will query the data, applying aggregates and tagging anomalies, then send it to Power BI.</p>
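<p>The anomaly tagging described above can be sketched as a simple per-reading rule check. The thresholds and field names below are hypothetical, since the lab’s actual Stream Analytics query defines its own conditions:</p>

```python
def tag_anomalies(reading):
    """Flag a telemetry reading the way the downstream query might,
    before the aggregates reach Power BI. Thresholds are illustrative."""
    tags = []
    if reading.get("engineTemperature", 0) > 400:
        tags.append("engine_overheating")
    if reading.get("oilPressure", 0) < 15:
        tags.append("abnormal_oil_pressure")
    return {**reading, "anomalies": tags}

print(tag_anomalies({"engineTemperature": 450, "oilPressure": 30}))
# {'engineTemperature': 450, 'oilPressure': 30, 'anomalies': ['engine_overheating']}
```

<p>In the actual pipeline, equivalent conditions live in the Stream Analytics query as <code>CASE</code>/<code>WHERE</code> expressions applied to the windowed aggregates.</p>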
<p>In this task, you will create and configure a new event hub within the provided Event Hubs namespace. This will be used to capture vehicle telemetry after it has been processed and enriched by the Azure function you will create later on.</p>
<ol type="1">
<li><p>Navigate to the <a href="https://portal.azure.com">Azure portal</a>.</p></li>
<li><p>Select <strong>Resource groups</strong> from the left-hand menu. Then select the resource group named <strong>tech-immersion-YOUR_UNIQUE_IDENTIFIER</strong>.</p>
<figure>
<img src="media/tech-immersion-rg.png" title="Resource groups" alt="The tech-immersion resource group is selected." /><figcaption>The tech-immersion resource group is selected.</figcaption>
</figure></li>
<li><p>Select the <strong>Event Hubs Namespace</strong> (<code>tech-immersion-hub-YOUR_UNIQUE_ID</code>) from the list of resources in your resource group.</p>
<figure>
<img src="media/tech-immersion-rg-event-hubs.png" title="tech-immersion resource group" alt="The Event Hubs Namespace is selected in the resource group." /><figcaption>The Event Hubs Namespace is selected in the resource group.</figcaption>
</figure></li>
<li><p>Within the Event Hubs Namespace blade, select <strong>Event Hubs</strong> within the left-hand menu.</p>
<figure>
<img src="media/event-hubs-link.png" title="Event Hubs link" alt="The Event Hubs link is selected in the left-hand menu." /><figcaption>The Event Hubs link is selected in the left-hand menu.</figcaption>
</figure></li>
<li><p>Select <strong>+ Event Hub</strong> in the top toolbar to create a new event hub in the namespace.</p>
<figure>
<img src="media/event-hubs-new-event-hub-link.png" title="New event hub link" alt="The new Event Hub link is highlighted in the top toolbar." /><figcaption>The new Event Hub link is highlighted in the top toolbar.</figcaption>
</figure></li>
<li><p>In the <strong>Create Event Hub</strong> blade, configure the following:</p>
<ul>
<li><strong>Name:</strong> Enter “telemetry”.</li>
<li><strong>Partition Count:</strong> Select 2.</li>
<li><strong>Message Retention</strong>: Select 1.</li>
<li><strong>Capture:</strong> Select Off.</li>
</ul>
<figure>
<img src="media/event-hubs-create-event-hub.png" title="Create Event Hub" alt="The Create Event Hub form is filled out with the previously mentioned settings entered into the appropriate fields." /><figcaption>The Create Event Hub form is filled out with the previously mentioned settings entered into the appropriate fields.</figcaption>
</figure></li>
<li><p>Select <strong>Create</strong> on the bottom of the form when you are finished entering the values.</p></li>
<li><p>Select your newly created <strong>telemetry</strong> event hub from the list after it is created.</p>
<figure>
<img src="media/event-hubs-select.png" title="Event hubs" alt="The newly created telemetry event hub is selected." /><figcaption>The newly created telemetry event hub is selected.</figcaption>
</figure></li>
<li><p>Select <strong>Shared access policies</strong> from the left-hand menu.</p>
<figure>
<img src="media/event-hubs-shared-access-policies-link.png" title="Shared access policies link" alt="The Shared access policies link is selected in the left-hand menu." /><figcaption>The Shared access policies link is selected in the left-hand menu.</figcaption>
</figure></li>
<li><p>Select <strong>+ Add</strong> in the top toolbar to create a new shared access policy.</p>
<figure>
<img src="media/event-hubs-shared-access-policies-add-link.png" title="Add" alt="The Add button is highlighted." /><figcaption>The Add button is highlighted.</figcaption>
</figure></li>
<li><p>In the <strong>Add SAS Policy</strong> blade, configure the following:</p>
<ul>
<li><strong>Name:</strong> Enter “Read”.</li>
<li><strong>Managed:</strong> Unchecked.</li>
<li><strong>Send:</strong> Unchecked.</li>
<li><strong>Listen:</strong> Checked.</li>
</ul>
<figure>
<img src="media/event-hubs-add-sas-policy-read.png" title="Add SAS Policy" alt="The Add SAS Policy form is filled out with the previously mentioned settings entered into the appropriate fields." /><figcaption>The Add SAS Policy form is filled out with the previously mentioned settings entered into the appropriate fields.</figcaption>
</figure>
<blockquote>
<p>It is a best practice to create separate policies for reading, writing, and managing events. This follows the principle of least privilege to prevent services and applications from performing unauthorized operations.</p>
</blockquote></li>
<li><p>Select <strong>Create</strong> on the bottom of the form when you are finished entering the values.</p></li>
<li><p>Select <strong>+ Add</strong> in the top toolbar to create a new shared access policy.</p>
<figure>
<img src="media/event-hubs-shared-access-policies-add-link.png" title="Add" alt="The Add button is highlighted." /><figcaption>The Add button is highlighted.</figcaption>
</figure></li>
<li><p>In the <strong>Add SAS Policy</strong> blade, configure the following:</p>
<ul>
<li><strong>Name:</strong> Enter “Write”.</li>
<li><strong>Managed:</strong> Unchecked.</li>
<li><strong>Send:</strong> Checked.</li>
<li><strong>Listen:</strong> Unchecked.</li>
</ul>
<figure>
<img src="media/event-hubs-add-sas-policy-write.png" title="Add SAS Policy" alt="The Add SAS Policy form is filled out with the previously mentioned settings entered into the appropriate fields." /><figcaption>The Add SAS Policy form is filled out with the previously mentioned settings entered into the appropriate fields.</figcaption>
</figure></li>
<li><p>Select <strong>Create</strong> on the bottom of the form when you are finished entering the values.</p></li>
<li><p>Select your <strong>Write</strong> policy from the list. Copy the <strong>Connection string - primary key</strong> value by selecting the Copy button to the right of the field. <strong>SAVE THIS VALUE</strong> in Notepad or similar text editor for later.</p>
<figure>
<img src="media/event-hubs-write-policy-key.png" title="SAS Policy: Write" alt="The Write policy is selected and its blade displayed. The Copy button next to the Connection string - primary key field is highlighted." /><figcaption>The Write policy is selected and its blade displayed. The Copy button next to the Connection string - primary key field is highlighted.</figcaption>
</figure></li>
</ol>
<h2 id="task-3-configure-stream-analytics">Task 3: Configure Stream Analytics</h2>
<p>Azure Stream Analytics is an event-processing engine that allows you to examine high volumes of data streaming from devices. Incoming data can come from devices, sensors, web sites, social media feeds, applications, and more. It also supports extracting information from data streams and identifying patterns and relationships. You can then use these patterns to trigger other actions downstream, such as creating alerts, feeding information to a reporting tool, or storing the data for later use.</p>
<p>In this task, you will configure Stream Analytics to use the event hub you created as a source, query and analyze that data, then send it to Power BI for reporting.</p>
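<p>As background for the query you will build in this task: a tumbling window partitions a stream into fixed, non-overlapping time buckets and computes one aggregate per bucket. The following standalone C# sketch (the timestamps and speeds are invented for illustration) mimics what a two-second tumbling average does:</p>

```csharp
// Sketch of 2-second tumbling-window averaging: events fall into fixed,
// non-overlapping 2-second buckets, and an average is computed per bucket.
using System;
using System.Linq;

// (timestampSeconds, speed) pairs standing in for vehicle telemetry events.
var events = new (double Time, double Speed)[]
{
    (0.2, 50), (1.1, 60),   // bucket [0, 2)
    (2.5, 80), (3.9, 90),   // bucket [2, 4)
};

// Assign each event to bucket floor(time / 2), then average speed per bucket.
var averages = events
    .GroupBy(e => (int)(e.Time / 2))
    .ToDictionary(g => g.Key, g => g.Average(e => e.Speed));

foreach (var kvp in averages.OrderBy(k => k.Key))
    Console.WriteLine($"window {kvp.Key}: averageSpeed = {kvp.Value}");
```

<p>The Stream Analytics runtime performs the same bucketing continuously over the live event stream, keyed by each event's timestamp.</p>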
<ol type="1">
<li><p>Navigate to the <a href="https://portal.azure.com">Azure portal</a>.</p></li>
<li><p>Select <strong>Resource groups</strong> from the left-hand menu. Then select the resource group named <strong>tech-immersion-YOUR_UNIQUE_IDENTIFIER</strong>.</p>
<figure>
<img src="media/tech-immersion-rg.png" title="Resource groups" alt="The tech-immersion resource group is selected." /><figcaption>The tech-immersion resource group is selected.</figcaption>
</figure></li>
<li><p>Select the <strong>Stream Analytics job</strong> (<code>tech-immersion-analytics-YOUR_UNIQUE_ID</code>) from the list of resources in your resource group.</p>
<figure>
<img src="media/tech-immersion-rg-stream-analytics.png" title="tech-immersion resource group" alt="The Stream Analytics job is selected in the resource group." /><figcaption>The Stream Analytics job is selected in the resource group.</figcaption>
</figure></li>
<li><p>Within the Stream Analytics job blade, select <strong>Inputs</strong> within the left-hand menu.</p>
<figure>
<img src="media/inputs-link.png" title="Inputs link" alt="The Inputs link is selected in the left-hand menu." /><figcaption>The Inputs link is selected in the left-hand menu.</figcaption>
</figure></li>
<li><p>Select <strong>+ Add stream input</strong> in the top toolbar, then select <strong>Event Hub</strong> to create a new Event Hub input.</p>
<figure>
<img src="media/stream-analytics-add-input-link.png" title="Add stream input - Event Hub" alt="The Add stream input button and Event Hub menu item are highlighted." /><figcaption>The Add stream input button and Event Hub menu item are highlighted.</figcaption>
</figure></li>
<li><p>In the <strong>New Input</strong> blade, configure the following:</p>
<ul>
<li><strong>Name:</strong> Enter “eventhub”.</li>
<li><strong>Select Event Hub from your subscriptions:</strong> Selected.</li>
<li><strong>Subscription:</strong> Make sure the subscription you are using for this lab is selected.</li>
<li><strong>Event Hub namespace:</strong> Select the Event Hub namespace you are using for this lab.</li>
<li><strong>Event Hub name:</strong> Select <strong>Use existing</strong>, then select <strong>telemetry</strong>, which you created earlier.</li>
<li><strong>Event Hub policy name:</strong> Select <strong>Read</strong>.</li>
<li>Leave all other values at their defaults.</li>
</ul>
<figure>
<img src="media/stream-analytics-new-input.png" title="New Input" alt="The New Input form is filled out with the previously mentioned settings entered into the appropriate fields." /><figcaption>The New Input form is filled out with the previously mentioned settings entered into the appropriate fields.</figcaption>
</figure></li>
<li><p>Select <strong>Save</strong> on the bottom of the form when you are finished entering the values.</p></li>
<li><p>Within the Stream Analytics job blade, select <strong>Outputs</strong> within the left-hand menu.</p>
<figure>
<img src="media/outputs-link.png" title="Outputs link" alt="The Outputs link is selected in the left-hand menu." /><figcaption>The Outputs link is selected in the left-hand menu.</figcaption>
</figure></li>
<li><p>Select <strong>+ Add</strong> in the top toolbar, then select <strong>Power BI</strong> to create a new Power BI output.</p>
<figure>
<img src="media/stream-analytics-add-output-link.png" title="Add output - Power BI" alt="The Add button and Power BI menu item are highlighted." /><figcaption>The Add button and Power BI menu item are highlighted.</figcaption>
</figure></li>
<li><p>In the <strong>New Output</strong> blade, select the <strong>Authorize</strong> button to authorize a connection from Stream Analytics to your Power BI account.</p>
<figure>
<img src="media/stream-analytics-new-output-authorize.png" title="New Output" alt="The Authorize button is highlighted in the New Output blade." /><figcaption>The Authorize button is highlighted in the New Output blade.</figcaption>
</figure></li>
<li><p>When prompted, sign in to your Power BI account using the same username and password you were provided and used to log in to the Azure portal.</p>
<figure>
<img src="media/power-bi-sign-in.png" title="Power BI Sign In" alt="The Power BI sign in form is displayed." /><figcaption>The Power BI sign in form is displayed.</figcaption>
</figure></li>
<li><p>After successfully signing in to your Power BI account, the New Output blade will update to show you are currently authorized.</p>
<figure>
<img src="media/stream-analytics-new-output-authorized.png" title="Authorized" alt="The New Output blade has been updated to show user is authorized to Power BI." /><figcaption>The New Output blade has been updated to show user is authorized to Power BI.</figcaption>
</figure></li>
<li><p>In the <strong>New Output</strong> blade, configure the following:</p>
<ul>
<li><strong>Output alias:</strong> Enter “powerBIAlerts”.</li>
<li><strong>Group workspace:</strong> Select My Workspace.</li>
<li><strong>Dataset name:</strong> Enter “VehicleAnomalies”.</li>
<li><strong>Table name:</strong> Enter “Alerts”.</li>
</ul>
<figure>
<img src="media/stream-analytics-new-output.png" title="New Output" alt="The New Output form is filled out with the previously mentioned settings entered into the appropriate fields." /><figcaption>The New Output form is filled out with the previously mentioned settings entered into the appropriate fields.</figcaption>
</figure></li>
<li><p>Select <strong>Save</strong> on the bottom of the form when you are finished entering the values.</p></li>
<li><p>Within the Stream Analytics job blade, select <strong>Query</strong> within the left-hand menu.</p>
<figure>
<img src="media/query-link.png" title="Query link" alt="The Query link is selected in the left-hand menu." /><figcaption>The Query link is selected in the left-hand menu.</figcaption>
</figure></li>
<li><p>Clear the edit <strong>Query</strong> window and paste the following in its place:</p>
<div class="sourceCode" id="cb1"><pre class="sourceCode sql"><code class="sourceCode sql"><a class="sourceLine" id="cb1-1" title="1"><span class="kw">WITH</span></a>
<a class="sourceLine" id="cb1-2" title="2">Averages <span class="kw">AS</span> (</a>
<a class="sourceLine" id="cb1-3" title="3"><span class="kw">select</span></a>
<a class="sourceLine" id="cb1-4" title="4"> <span class="fu">AVG</span>(engineTemperature) averageEngineTemperature,</a>
<a class="sourceLine" id="cb1-5" title="5"> <span class="fu">AVG</span>(speed) averageSpeed</a>
<a class="sourceLine" id="cb1-6" title="6"><span class="kw">FROM</span></a>
<a class="sourceLine" id="cb1-7" title="7"> eventhub <span class="dt">TIMESTAMP</span> <span class="kw">BY</span> [<span class="dt">timestamp</span>]</a>
<a class="sourceLine" id="cb1-8" title="8"><span class="kw">GROUP</span> <span class="kw">BY</span></a>
<a class="sourceLine" id="cb1-9" title="9"> TumblingWindow(Duration(<span class="dt">second</span>, <span class="dv">2</span>))</a>
<a class="sourceLine" id="cb1-10" title="10">),</a>
<a class="sourceLine" id="cb1-11" title="11">Anomalies <span class="kw">AS</span> (</a>
<a class="sourceLine" id="cb1-12" title="12"><span class="kw">select</span></a>
<a class="sourceLine" id="cb1-13" title="13"> t.vin,</a>
<a class="sourceLine" id="cb1-14" title="14"> t.[<span class="dt">timestamp</span>],</a>
<a class="sourceLine" id="cb1-15" title="15"> t.city,</a>
<a class="sourceLine" id="cb1-16" title="16"> t.region,</a>
<a class="sourceLine" id="cb1-17" title="17"> t.outsideTemperature,</a>
<a class="sourceLine" id="cb1-18" title="18"> t.engineTemperature,</a>
<a class="sourceLine" id="cb1-19" title="19"> a.averageEngineTemperature,</a>
<a class="sourceLine" id="cb1-20" title="20"> t.speed,</a>
<a class="sourceLine" id="cb1-21" title="21"> a.averageSpeed,</a>
<a class="sourceLine" id="cb1-22" title="22"> t.fuel,</a>
<a class="sourceLine" id="cb1-23" title="23"> t.engineoil,</a>
<a class="sourceLine" id="cb1-24" title="24"> t.tirepressure,</a>
<a class="sourceLine" id="cb1-25" title="25"> t.odometer,</a>
<a class="sourceLine" id="cb1-26" title="26"> t.accelerator_pedal_position,</a>
<a class="sourceLine" id="cb1-27" title="27"> t.parking_brake_status,</a>
<a class="sourceLine" id="cb1-28" title="28"> t.headlamp_status,</a>
<a class="sourceLine" id="cb1-29" title="29"> t.brake_pedal_status,</a>
<a class="sourceLine" id="cb1-30" title="30"> t.transmission_gear_position,</a>
<a class="sourceLine" id="cb1-31" title="31"> t.ignition_status,</a>
<a class="sourceLine" id="cb1-32" title="32"> t.windshield_wiper_status,</a>
<a class="sourceLine" id="cb1-33" title="33"> t.<span class="fu">abs</span>,</a>
<a class="sourceLine" id="cb1-34" title="34"> (<span class="cf">case</span> <span class="cf">when</span> a.averageEngineTemperature <span class="op">>=</span> <span class="dv">405</span> <span class="kw">OR</span> a.averageEngineTemperature <span class="op"><=</span> <span class="dv">15</span> <span class="cf">then</span> <span class="dv">1</span> <span class="cf">else</span> <span class="dv">0</span> <span class="cf">end</span>) <span class="kw">as</span> enginetempanomaly,</a>
<a class="sourceLine" id="cb1-35" title="35"> (<span class="cf">case</span> <span class="cf">when</span> t.engineoil <span class="op"><=</span> <span class="dv">1</span> <span class="cf">then</span> <span class="dv">1</span> <span class="cf">else</span> <span class="dv">0</span> <span class="cf">end</span>) <span class="kw">as</span> oilanomaly,</a>
<a class="sourceLine" id="cb1-36" title="36"> (<span class="cf">case</span> <span class="cf">when</span> (t.transmission_gear_position <span class="op">=</span> <span class="st">'first'</span> <span class="kw">OR</span></a>
<a class="sourceLine" id="cb1-37" title="37"> t.transmission_gear_position <span class="op">=</span> <span class="st">'second'</span> <span class="kw">OR</span></a>
<a class="sourceLine" id="cb1-38" title="38"> t.transmission_gear_position <span class="op">=</span> <span class="st">'third'</span>) <span class="kw">AND</span></a>
<a class="sourceLine" id="cb1-39" title="39"> t.brake_pedal_status <span class="op">=</span> <span class="dv">1</span> <span class="kw">AND</span></a>
<a class="sourceLine" id="cb1-40" title="40"> t.accelerator_pedal_position <span class="op">>=</span> <span class="dv">90</span> <span class="kw">AND</span></a>
<a class="sourceLine" id="cb1-41" title="41"> a.averageSpeed <span class="op">>=</span> <span class="dv">55</span> <span class="cf">then</span> <span class="dv">1</span> <span class="cf">else</span> <span class="dv">0</span> <span class="cf">end</span>) <span class="kw">as</span> aggressivedriving</a>
<a class="sourceLine" id="cb1-42" title="42"><span class="kw">from</span> eventhub t <span class="dt">TIMESTAMP</span> <span class="kw">BY</span> [<span class="dt">timestamp</span>]</a>
<a class="sourceLine" id="cb1-43" title="43"><span class="kw">INNER</span> <span class="kw">JOIN</span> Averages a <span class="kw">ON</span> DATEDIFF(<span class="dt">second</span>, t, a) <span class="kw">BETWEEN</span> <span class="dv">0</span> <span class="kw">And</span> <span class="dv">2</span></a>
<a class="sourceLine" id="cb1-44" title="44">)</a>
<a class="sourceLine" id="cb1-45" title="45"><span class="kw">SELECT</span></a>
<a class="sourceLine" id="cb1-46" title="46"> <span class="op">*</span></a>
<a class="sourceLine" id="cb1-47" title="47"><span class="kw">INTO</span></a>
<a class="sourceLine" id="cb1-48" title="48"> powerBIAlerts</a>
<a class="sourceLine" id="cb1-49" title="49"><span class="kw">FROM</span></a>
<a class="sourceLine" id="cb1-50" title="50"> Anomalies</a>
<a class="sourceLine" id="cb1-51" title="51"><span class="kw">where</span> aggressivedriving <span class="op">=</span> <span class="dv">1</span> <span class="kw">OR</span> enginetempanomaly <span class="op">=</span> <span class="dv">1</span> <span class="kw">OR</span> oilanomaly <span class="op">=</span> <span class="dv">1</span></a></code></pre></div>
<figure>
<img src="media/stream-analytics-query.png" title="Query window" alt="The query above has been inserted into the Query window." /><figcaption>The query above has been inserted into the Query window.</figcaption>
</figure>
<p>The query averages the engine temperature and speed over a two-second tumbling window. It then selects all telemetry data, including the average values from the previous step, and adds the following anomaly flags as new fields:</p>
<ol type="a">
<li><p><strong>enginetempanomaly</strong>: When the average engine temperature is >= 405 or <= 15.</p></li>
<li><p><strong>oilanomaly</strong>: When the engine oil <= 1.</p></li>
<li><p><strong>aggressivedriving</strong>: When the transmission gear position is in first, second, or third, and the brake pedal status is 1, the accelerator pedal position >= 90, and the average speed is >= 55.</p></li>
</ol>
<p>Finally, the query outputs all fields from the anomalies step into the <code>powerBIAlerts</code> output where aggressivedriving = 1 or enginetempanomaly = 1 or oilanomaly = 1.</p></li>
<li><p>Select <strong>Save</strong> in the top toolbar when you are finished updating the query.</p></li>
<li><p>Within the Stream Analytics job blade, select <strong>Overview</strong> within the left-hand menu. On top of the Overview blade, select <strong>Start</strong>.</p>
<figure>
<img src="media/stream-analytics-overview-start-button.png" title="Overview" alt="The Start button is highlighted on top of the Overview blade." /><figcaption>The Start button is highlighted on top of the Overview blade.</figcaption>
</figure></li>
<li><p>In the Start job blade that appears, select <strong>Now</strong> for the job output start time, then select <strong>Start</strong>. This will start the Stream Analytics job so it will be ready to start processing and sending your events to Power BI later on.</p>
<figure>
<img src="media/stream-analytics-start-job.png" title="Start job" alt="The Now and Start buttons are highlighted within the Start job blade." /><figcaption>The Now and Start buttons are highlighted within the Start job blade.</figcaption>
</figure></li>
</ol>
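<p>The three anomaly conditions in the query you just saved are plain boolean rules, so they can also be read as ordinary code. This hypothetical C# translation (names and thresholds mirror the query fields) may make them easier to follow:</p>

```csharp
// Direct translation of the three anomaly rules from the Stream Analytics
// query; field names and thresholds are taken from the query itself.
using System;

bool EngineTempAnomaly(double avgEngineTemp) =>
    avgEngineTemp >= 405 || avgEngineTemp <= 15;

bool OilAnomaly(double engineOil) => engineOil <= 1;

bool AggressiveDriving(string gear, int brakePedalStatus,
                       double acceleratorPedalPosition, double avgSpeed) =>
    (gear == "first" || gear == "second" || gear == "third")
    && brakePedalStatus == 1
    && acceleratorPedalPosition >= 90
    && avgSpeed >= 55;

// A record reaches the powerBIAlerts output only if at least one rule fires.
bool isAlert = EngineTempAnomaly(420) || OilAnomaly(5) || AggressiveDriving("fourth", 0, 20, 30);
Console.WriteLine(isAlert); // True (engine temperature 420 >= 405)
```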
<h2 id="task-4-configure-azure-function-app">Task 4: Configure Azure Function App</h2>
<p>Azure Functions is a solution for easily running small pieces of code, or “functions,” in the cloud. You can write just the code you need for the problem at hand, without worrying about a whole application or the infrastructure to run it. Functions can make development even more productive, and you can use your development language of choice, such as C#, F#, Node.js, Java, or PHP.</p>
<p>When you use the Azure Functions consumption plan, you only pay for the time your code runs. Azure automatically handles scaling your functions to meet demand.</p>
<p>Azure Functions uses special bindings that allow you to automatically trigger the function when an event happens (a document is added to Azure Cosmos DB, a file is uploaded to blob storage, an event is added to Event Hubs, an HTTP request to the function is made, etc.), as well as to retrieve or send information to and from various Azure services. In the case of our function for this solution, we are using the <code>CosmosDBTrigger</code> to automatically trigger the function through the Cosmos DB change feed. This trigger supplies an input binding of type <code>IReadOnlyList&lt;Document&gt;</code>, which we name “input” and which contains the records that triggered the function. This removes any code you would otherwise have to write to query that data. We also have an output binding to the event hub, of type <code>IAsyncCollector&lt;EventData&gt;</code>, which we name “eventHubOutput”. Again, this reduces code by automatically sending data added to this collection to the specified event hub.</p>
<figure>
<img src="media/function-definition.png" title="Azure function" alt="The Cosmos DB trigger, input binding, and output binding are highlighted." /><figcaption>The Cosmos DB trigger, input binding, and output binding are highlighted.</figcaption>
</figure>
<p>The function code itself is very lightweight, thanks to the resource bindings and the <code>TelemetryProcessing.ProcessEvent</code> method that it calls:</p>
<figure>
<img src="media/function-code.png" title="Function code" alt="The function code is very lightweight." /><figcaption>The function code is very lightweight.</figcaption>
</figure>
<p>The <code>TelemetryProcessing</code> class contains a simple method named <code>ProcessEvent</code> that evaluates the vehicle telemetry data sent by Cosmos DB and enriches it with the region name based on a simple map.</p>
<figure>
<img src="media/function-process-event.png" title="TelemetryProcessing.ProcessEvent method" alt="Source code for the ProcessEvent method." /><figcaption>Source code for the ProcessEvent method.</figcaption>
</figure>
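<p>Conceptually, the enrichment performed by <code>ProcessEvent</code> amounts to a dictionary lookup from a telemetry field to a region name. A minimal sketch, assuming a hypothetical city-to-region map (the actual keys and values in the lab's <code>TelemetryProcessing</code> class may differ):</p>

```csharp
// Hypothetical city-to-region map illustrating the enrichment idea; the
// real ProcessEvent mapping in the lab solution may use different values.
using System;
using System.Collections.Generic;

var regionMap = new Dictionary<string, string>
{
    ["Seattle"]  = "West",
    ["Chicago"]  = "Central",
    ["New York"] = "East",
};

// Enrich a telemetry record's city with a region, defaulting to "Unknown".
string Enrich(string city) =>
    regionMap.TryGetValue(city, out var region) ? region : "Unknown";

Console.WriteLine(Enrich("Seattle")); // West
Console.WriteLine(Enrich("Boise"));   // Unknown
```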
<p>In this task, you will configure the Function App with the Azure Cosmos DB and Event Hubs connection strings.</p>
<ol type="1">
<li><p>Navigate to the <a href="https://portal.azure.com">Azure portal</a>.</p></li>
<li><p>Select <strong>Resource groups</strong> from the left-hand menu. Then select the resource group named <strong>tech-immersion-YOUR_UNIQUE_IDENTIFIER</strong>.</p>
<figure>
<img src="media/tech-immersion-rg.png" title="Resource groups" alt="The tech-immersion resource group is selected." /><figcaption>The tech-immersion resource group is selected.</figcaption>
</figure></li>
<li><p>Select the <strong>App Service</strong> (Azure Function App) that includes <strong>day1</strong> in its name from the list of resources in your resource group.</p>
<figure>
<img src="media/tech-immersion-rg-function-app.png" title="tech-immersion resource group" alt="The App Service Function App is selected in the resource group." /><figcaption>The App Service Function App is selected in the resource group.</figcaption>
</figure></li>
<li><p>Within the Function App Overview blade, scroll down and select <strong>Application settings</strong>.</p>
<figure>
<img src="media/function-app-app-settings-link.png" title="Function App overview" alt="The Function App Overview blade is displayed with the Application Settings link highlighted." /><figcaption>The Function App Overview blade is displayed with the Application Settings link highlighted.</figcaption>
</figure></li>
<li><p>Select <strong>Add new setting</strong> at the bottom of the Application settings section.</p>
<figure>
<img src="media/function-app-app-settings-new-link.png" title="Application settings" alt="The Add new setting link is highlighted on the bottom of the Application settings section." /><figcaption>The Add new setting link is highlighted on the bottom of the Application settings section.</figcaption>
</figure></li>
<li><p>Enter <code>CosmosDbConnectionString</code> into the <strong>Name</strong> field, then paste your Cosmos DB connection string into the <strong>Value</strong> field. If you cannot locate your connection string, refer to Task 1, step 10.</p>
<figure>
<img src="media/function-app-app-settings-cosmos-db.png" title="Application settings" alt="The CosmosDbConnectionString name and value pair has been added and is highlighted." /><figcaption>The CosmosDbConnectionString name and value pair has been added and is highlighted.</figcaption>
</figure></li>
<li><p>Select <strong>Add new setting</strong> underneath the new application setting you just added to add a new one.</p></li>
<li><p>Enter <code>EventHubsConnectionString</code> into the <strong>Name</strong> field, then paste your Event Hubs connection string into the <strong>Value</strong> field. This is the connection string for the <strong>Write</strong> shared access policy you created. If you cannot locate your connection string, refer to Task 2, step 17.</p>
<figure>
<img src="media/function-app-app-settings-event-hubs.png" title="Application settings" alt="The EventHubsConnectionString name and value pair has been added and is highlighted." /><figcaption>The EventHubsConnectionString name and value pair has been added and is highlighted.</figcaption>
</figure></li>
<li><p>Scroll to the top of the page and select <strong>Save</strong> in the top toolbar to apply your changes.</p>
<figure>
<img src="media/function-app-app-settings-save.png" title="Application settings" alt="The Save button is highlighted on top of the Application settings blade." /><figcaption>The Save button is highlighted on top of the Application settings blade.</figcaption>
</figure></li>
</ol>
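<p>Application settings such as the two you just added are exposed to the Function App process as environment variables, so function code can read them with <code>Environment.GetEnvironmentVariable</code>. A minimal sketch (the connection-string value below is a stand-in, set locally to simulate the App Service environment):</p>

```csharp
// In Azure, App Service injects application settings into the process as
// environment variables. Set one locally here to simulate that behavior.
using System;

Environment.SetEnvironmentVariable(
    "EventHubsConnectionString",
    "Endpoint=sb://example.servicebus.windows.net/;EntityPath=telemetry"); // placeholder value

var hubConnection = Environment.GetEnvironmentVariable("EventHubsConnectionString");
var cosmosConnection = Environment.GetEnvironmentVariable("CosmosDbConnectionString"); // null when unset

Console.WriteLine(hubConnection != null
    ? "EventHubsConnectionString found"
    : "EventHubsConnectionString missing");
```

<p>The lab's function itself relies on binding expressions rather than reading these settings directly, but this is how the settings surface to any code running in the Function App.</p>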
<h2 id="task-5-publish-function-app-and-run-data-generator">Task 5: Publish Function App and run data generator</h2>
<p>The data generator console application creates and sends simulated vehicle sensor telemetry for an array of vehicles, each identified by a VIN (vehicle identification number), directly to Cosmos DB. For this to happen, you first need to configure it with the Cosmos DB connection string.</p>
<p>In this task, you will open the lab solution in Visual Studio, publish the Function App, and configure and run the data generator. The data generator saves simulated vehicle telemetry data to Cosmos DB, which triggers the Azure function to run and process the data, sending it to Event Hubs, prompting your Stream Analytics job to aggregate and analyze the enriched data and send it to Power BI. The final step will be to create the Power BI report in the task that follows.</p>
<ol type="1">
<li><p>Open File Explorer and navigate to <code>C:\lab-files\data\2</code>. Double-click on <strong>TechImmersion.sln</strong> to open the solution in Visual Studio. <strong>NOTE:</strong> If you are prompted by Visual Studio to log in, use the Azure Active Directory credentials you are using for this lab (they are called <code>Username</code> and <code>Password</code> under the heading Azure Credentials in the documentation).</p>
<figure>
<img src="media/vs-solution.png" title="Windows explorer" alt="The TechImmersion.sln file is highlighted in the C:-immersion folder." /><figcaption>The TechImmersion.sln file is highlighted in the C:-immersion folder.</figcaption>
</figure>
<p>The Visual Studio solution contains the following projects:</p>
<ul>
<li><strong>TechImmersion.CarEventProcessor</strong>: Azure Function App project from which you will publish the Azure function that processes Cosmos DB documents as they arrive, and sends them to Event Hubs.</li>
<li><strong>TechImmersion.Common</strong>: Common library that contains models and structs used by the other projects within the solution.</li>
<li><strong>TransactionGenerator</strong>: Console app that generates simulated vehicle telemetry and writes it to Cosmos DB.</li>
</ul></li>
<li><p>Select the <strong>Build</strong> menu item, then select <strong>Build Solution</strong>. You should see a message in the output window at the bottom of the Visual Studio window indicating that the build completed successfully. One of the operations that completes during this process is downloading and installing all NuGet packages.</p>
<figure>
<img src="media/vs-build-solution.png" title="Build Solution" alt="The Build menu item and Build Solution sub-menu item are highlighted." /><figcaption>The Build menu item and Build Solution sub-menu item are highlighted.</figcaption>
</figure></li>
<li><p>You will see the projects listed within the Solution Explorer in Visual Studio. Right-click the <strong>TechImmersion.CarEventProcessor</strong> project, then select <strong>Publish…</strong> in the context menu.</p>
<figure>
<img src="media/vs-publish-link.png" title="Solution Explorer" alt="The TechImmersion.CarEventProcessor project and the Publish menu item are highlighted." /><figcaption>The TechImmersion.CarEventProcessor project and the Publish menu item are highlighted.</figcaption>
</figure></li>
<li><p>Select <strong>Select Existing</strong> underneath Azure App Service since you will be publishing this to an existing Function App. Click <strong>Publish</strong> on the bottom of the dialog window. If you are prompted to log into your Azure Account, log in with the Azure account you are using for this lab.</p>
<figure>
<img src="media/vs-publish-target.png" title="Pick a publish target" alt="The Select Existing radio button and Publish button are highlighted." /><figcaption>The Select Existing radio button and Publish button are highlighted.</figcaption>
</figure></li>
<li><p>In the App Service dialog that follows, make sure your Azure <strong>Subscription</strong> for this lab is selected, then find and expand the <strong>tech-immersion-YOUR_UNIQUE_IDENTIFIER</strong> resource group. Select your Function App that includes <strong>day1</strong> in its name, then click <strong>OK</strong> on the bottom of the dialog window.</p>
<figure>
<img src="media/vs-publish-app-service.png" title="App Service" alt="The Function App and OK button are highlighted." /><figcaption>The Function App and OK button are highlighted.</figcaption>
</figure></li>
<li><p>The Function App will start publishing in a moment. You can watch the output window for the publish status. When it is done publishing, you should see a “Publish completed” message on the bottom of the output window.</p>
<figure>
<img src="media/vs-publish-output.png" title="Publish output" alt="The Publish Succeeded and Publish Completed messages are highlighted in the output window." /><figcaption>The Publish Succeeded and Publish Completed messages are highlighted in the output window.</figcaption>
</figure></li>
<li><p>Expand the <strong>TransactionGenerator</strong> project within the Solution Explorer, then double-click on <strong>appsettings.json</strong> to open it.</p>
<figure>
<img src="media/vs-appsettings-link.png" title="Solution Explorer" alt="The appsettings.json file is highlighted in Solution Explorer." /><figcaption>The appsettings.json file is highlighted in Solution Explorer.</figcaption>
</figure></li>
<li><p>Paste your Cosmos DB connection string value next to <code>COSMOS_DB_CONNECTION_STRING</code>. Make sure you have quotes ("") around the value, as shown. <strong>Save</strong> the file.</p>
<figure>
<img src="media/vs-appsettings.png" title="appsettings.json" alt="The Cosmos DB connection string is highlighted within the appsettings.json file." /><figcaption>The Cosmos DB connection string is highlighted within the appsettings.json file.</figcaption>
</figure>
<p><code>SECONDS_TO_LEAD</code> is the amount of time to wait before sending vehicle telemetry data. The default value is <code>0</code>.</p>
<p><code>SECONDS_TO_RUN</code> is the maximum amount of time to allow the generator to run before stopping transmission of data. The default value is <code>1800</code>. Data will also stop transmitting when you enter Ctrl+C while the generator is running, or if you close the window.</p></li>
<li><p>Now you are ready to run the transaction generator. Select the <strong>Debug</strong> menu item, then select <strong>Start Debugging</strong>, or press <em>F5</em> on your keyboard.</p>
<figure>
<img src="media/vs-debug.png" title="Debug" alt="The Debug menu item and Start Debugging sub-menu item are selected" /><figcaption>The Debug menu item and Start Debugging sub-menu item are selected</figcaption>
</figure></li>
<li><p>A new console window will open, and you should see it start to send data after a few seconds. Once you see that it is sending data to Cosmos DB, <em>minimize</em> the window and keep it running in the background.</p>
<figure>
<img src="media/vs-console.png" title="Console window" alt="Screenshot of the console window." /><figcaption>Screenshot of the console window.</figcaption>
</figure>
<p>The top of the output displays information about the Cosmos DB collection you created (<code>telemetry</code>), the requested RU/s, and the estimated hourly and monthly cost at that throughput. Output statistics appear after every 500 records are sent.</p></li>
</ol>
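<p>Putting the settings above together, a complete <code>appsettings.json</code> might look like the following sketch. The connection string shown is a placeholder (substitute your own value from the Azure portal), and the exact value types may differ slightly from the project's file:</p>
<div class="sourceCode"><pre class="sourceCode json"><code class="sourceCode json">{
  "COSMOS_DB_CONNECTION_STRING": "&lt;paste your Cosmos DB connection string here&gt;",
  "SECONDS_TO_LEAD": "0",
  "SECONDS_TO_RUN": "1800"
}</code></pre></div>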
<hr />
<p>Some key areas to point out about the data generator code are as follows:</p>
<p>Within the <code>Program.cs</code> file, we instantiate a new Cosmos DB client (<code>DocumentClient</code>), passing in the Cosmos DB service endpoint, authorization key, and connection policy (Direct mode over TCP for the best performance). Next, we retrieve the Cosmos DB collection information and create an offer query (<code>CreateOfferQuery</code>) to pull statistics about the provisioned throughput in RU/s so we can estimate the hourly and monthly cost. Finally, we call the <code>SendData</code> method to start sending telemetry data to Cosmos DB.</p>
<figure>
<img src="media/telemetry-generator-code.png" title="Telemetry generator code" alt="The telemetry generator code is displayed showing the Cosmos DB client instantiation." /><figcaption>The telemetry generator code is displayed showing the Cosmos DB client instantiation.</figcaption>
</figure>
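<p>For reference, the client instantiation described above can be sketched as follows. The endpoint and key values are illustrative placeholders, and the exact code in <code>Program.cs</code> may differ slightly:</p>
<div class="sourceCode"><pre class="sourceCode csharp"><code class="sourceCode cs">// Direct mode over TCP minimizes request latency compared to the default Gateway mode.
var connectionPolicy = new ConnectionPolicy
{
    ConnectionMode = ConnectionMode.Direct,
    ConnectionProtocol = Protocol.Tcp
};

// Instantiate the Cosmos DB client with the service endpoint and authorization key.
var client = new DocumentClient(
    new Uri("https://&lt;your-account&gt;.documents.azure.com:443/"),
    "&lt;your-authorization-key&gt;",
    connectionPolicy);</code></pre></div>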
<p>The <code>SendData</code> method outputs statistics about how much data was sent to Cosmos DB and how long it took to send, which varies based on your available system resources and internet bandwidth. It sends the telemetry data (<code>carEvent</code>) in one line of code:</p>
<div class="sourceCode" id="cb2"><pre class="sourceCode csharp"><code class="sourceCode cs"><a class="sourceLine" id="cb2-1" title="1"><span class="co">// Send to Cosmos DB:</span></a>
<a class="sourceLine" id="cb2-2" title="2"><span class="dt">var</span> response = await _cosmosDbClient.<span class="fu">CreateDocumentAsync</span>(collectionUri, carEvent)</a>
<a class="sourceLine" id="cb2-3" title="3"> .<span class="fu">ConfigureAwait</span>(<span class="kw">false</span>);</a></code></pre></div>
<p>The last bit of interesting code within the generator is where we create the Cosmos DB database and collection if they do not exist. We also specify the collection's partition key, indexing policy, and a throughput of 15,000 RU/s:</p>
<figure>
<img src="media/telemetry-generator-initialize-cosmos.png" title="InitializeCosmosDB method" alt="The InitializeCosmosDB method code." /><figcaption>The InitializeCosmosDB method code.</figcaption>
</figure>
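<p>A minimal sketch of that initialization logic follows. The partition key path (<code>/vin</code>) is illustrative; the actual path and indexing policy are defined in the project's <code>InitializeCosmosDB</code> method:</p>
<div class="sourceCode"><pre class="sourceCode csharp"><code class="sourceCode cs">// Create the database if it does not already exist.
await client.CreateDatabaseIfNotExistsAsync(new Database { Id = "ContosoAuto" });

// Define the collection with a partition key.
var collection = new DocumentCollection { Id = "telemetry" };
collection.PartitionKey.Paths.Add("/vin"); // illustrative partition key path

// Create the collection if it does not exist, provisioning 15,000 RU/s.
await client.CreateDocumentCollectionIfNotExistsAsync(
    UriFactory.CreateDatabaseUri("ContosoAuto"),
    collection,
    new RequestOptions { OfferThroughput = 15000 });</code></pre></div>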
<h2 id="task-6-view-published-function">Task 6: View published function</h2>
<p>A few minutes ago, you published your Azure Function App from Visual Studio. This Function App contains a single function, <code>CarEventProcessor</code>, which you will examine in this task.</p>
<p>You will notice that the Function App is now read-only. This is because you published a function with a generated function.json file from Visual Studio. The Function App adds this protection when you publish so you do not accidentally overwrite the function.json file or related files through the portal UI.</p>
<ol type="1">
<li><p>Navigate to the <a href="https://portal.azure.com">Azure portal</a>.</p></li>
<li><p>Select <strong>Resource groups</strong> from the left-hand menu. Then select the resource group named <strong>tech-immersion-YOUR_UNIQUE_IDENTIFIER</strong>.</p>
<figure>
<img src="media/tech-immersion-rg.png" title="Resource groups" alt="The tech-immersion resource group is selected." /><figcaption>The tech-immersion resource group is selected.</figcaption>
</figure></li>
<li><p>Select the <strong>App Service</strong> (Azure Function App) that includes <strong>day1</strong> in its name from the list of resources in your resource group.</p>
<figure>
<img src="media/tech-immersion-rg-function-app.png" title="tech-immersion resource group" alt="The App Service Function App is selected in the resource group." /><figcaption>The App Service Function App is selected in the resource group.</figcaption>
</figure></li>
<li><p>Expand <strong>Functions (Read Only)</strong> within the navigation tree to the left, then select <strong>CarEventProcessor</strong>.</p>
<figure>
<img src="media/function-app-tree.png" title="Functions" alt="The Functions node is expanded in the navigation tree, and the CarEventProcessor is selected." /><figcaption>The Functions node is expanded in the navigation tree, and the CarEventProcessor is selected.</figcaption>
</figure></li>
<li><p>Looking at the <strong>function.json</strong> file to the right, notice that it was generated for you when you published from Visual Studio. Also notice how the <code>bindings</code> section lines up with the function method in <code>CarEventProcessorFunctions.cs</code>:</p>
<div class="sourceCode" id="cb3"><pre class="sourceCode csharp"><code class="sourceCode cs"><a class="sourceLine" id="cb3-1" title="1">[<span class="fu">FunctionName</span>(<span class="st">"CarEventProcessor"</span>)]</a>
<a class="sourceLine" id="cb3-2" title="2"><span class="kw">public</span> <span class="kw">static</span> async Task <span class="fu">CarEventProcessor</span>([<span class="fu">CosmosDBTrigger</span>(</a>
<a class="sourceLine" id="cb3-3" title="3"> databaseName: <span class="st">"ContosoAuto"</span>,</a>
<a class="sourceLine" id="cb3-4" title="4"> collectionName: <span class="st">"telemetry"</span>,</a>
<a class="sourceLine" id="cb3-5" title="5"> ConnectionStringSetting = <span class="st">"CosmosDbConnectionString"</span>,</a>
<a class="sourceLine" id="cb3-6" title="6"> LeaseCollectionName = <span class="st">"leases"</span>,</a>
<a class="sourceLine" id="cb3-7" title="7"> CreateLeaseCollectionIfNotExists = <span class="kw">true</span>)]IReadOnlyList<Document> input,</a>
<a class="sourceLine" id="cb3-8" title="8"> [<span class="fu">EventHub</span>(<span class="st">"telemetry"</span>,</a>
<a class="sourceLine" id="cb3-9" title="9"> Connection=<span class="st">"EventHubsConnectionString"</span>)]IAsyncCollector<EventData> eventHubOutput,</a>
<a class="sourceLine" id="cb3-10" title="10"> ILogger log)</a></code></pre></div>
<figure>
<img src="media/function-app-functions-json.png" title="function.json" alt="The function.json file is displayed with the bindings highlighted." /><figcaption>The function.json file is displayed with the bindings highlighted.</figcaption>
</figure></li>
</ol>
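<p>For compiled C# projects, the generated function.json is built from the binding attributes shown above rather than written by hand. A representative (illustrative) example follows; note that only the trigger binding appears in the generated file, while output bindings such as the Event Hub binding remain declared in code, and the <code>scriptFile</code> and <code>entryPoint</code> values depend on your project's assembly and namespace:</p>
<div class="sourceCode"><pre class="sourceCode json"><code class="sourceCode json">{
  "generatedBy": "Microsoft.NET.Sdk.Functions",
  "configurationSource": "attributes",
  "bindings": [
    {
      "type": "cosmosDBTrigger",
      "connectionStringSetting": "CosmosDbConnectionString",
      "name": "input",
      "databaseName": "ContosoAuto",
      "collectionName": "telemetry",
      "leaseCollectionName": "leases",
      "createLeaseCollectionIfNotExists": true
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/&lt;YourAssembly&gt;.dll",
  "entryPoint": "&lt;YourNamespace&gt;.CarEventProcessorFunctions.CarEventProcessor"
}</code></pre></div>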
<h2 id="task-7-create-power-bi-dashboard">Task 7: Create Power BI dashboard</h2>
<p>In this task, you will use Power BI to create a report showing captured vehicle anomaly data. Then you will pin that report to a live dashboard for near real-time updates.</p>
<ol type="1">
<li><p>Open your web browser and navigate to <a href="https://powerbi.microsoft.com/" class="uri">https://powerbi.microsoft.com/</a>. Select <strong>Sign in</strong> on the upper-right.</p>
<figure>
<img src="media/pbi-signin.png" title="Power BI home page" alt="The Power BI home page is shown with the Sign in link highlighted." /><figcaption>The Power BI home page is shown with the Sign in link highlighted.</figcaption>
</figure></li>
<li><p>Enter your Power BI credentials you used when creating the Power BI output for Stream Analytics.</p></li>
<li><p>After signing in, select <strong>My Workspace</strong> on the left-hand menu.</p>
<figure>
<img src="media/pbi-my-workspace-link.png" title="My Workspace" alt="The My Workspace link is selected on the left-hand menu." /><figcaption>The My Workspace link is selected on the left-hand menu.</figcaption>
</figure></li>
<li><p>Select the <strong>Datasets</strong> tab at the top of the workspace. Locate the dataset named <strong>VehicleAnomalies</strong>, then select the <strong>Create Report</strong> action button to the right of the name. If you do not see the dataset, you may need to wait a few minutes and refresh the page.</p>
<figure>
<img src="media/pbi-my-workspace.png" title="Datasets" alt="The Datasets tab is selected in My Workspace and the VehicleAnomalies dataset is highlighted." /><figcaption>The Datasets tab is selected in My Workspace and the VehicleAnomalies dataset is highlighted.</figcaption>
</figure>
<blockquote>
<p><strong>Note:</strong> It can take several minutes for the dataset to appear. You may need to periodically refresh the page before you see the Datasets tab.</p>
</blockquote></li>
<li><p>You should see a new blank report for VehicleAnomalies with the field list on the far right.</p>
<figure>
<img src="media/pbi-blank-report.png" title="Blank report" alt="A new blank report is displayed with the field list on the right." /><figcaption>A new blank report is displayed with the field list on the right.</figcaption>
</figure></li>
<li><p>Select the <strong>Map</strong> visualization within the Visualizations section on the right.</p>
<figure>
<img src="media/pbi-map-vis.png" title="Visualizations" alt="The Map visualization is highlighted." /><figcaption>The Map visualization is highlighted.</figcaption>
</figure></li>
<li><p>Drag the <strong>city</strong> field to <strong>Location</strong>, and <strong>aggressivedriving</strong> to <strong>Size</strong>. This will place points of different sizes over cities on the map, depending on how many aggressive driving records there are.</p>
<figure>
<img src="media/pbi-map-fields.png" title="Map settings" alt="Screenshot displaying where to drag the fields onto the map settings." /><figcaption>Screenshot displaying where to drag the fields onto the map settings.</figcaption>
</figure></li>
<li><p>Your map should look similar to the following:</p>
<figure>
<img src="media/pbi-map.png" title="Map" alt="The map is shown on the report." /><figcaption>The map is shown on the report.</figcaption>
</figure></li>
<li><p>Select a blank area on the report to deselect the map. Now select the <strong>Treemap</strong> visualization.</p>
<figure>
<img src="media/pbi-treemap-vis.png" title="Visualization" alt="The Treemap visualization is highlighted." /><figcaption>The Treemap visualization is highlighted.</figcaption>
</figure></li>
<li><p>Drag the <strong>enginetemperature</strong> field to <strong>Values</strong>, then drag the <strong>transmission_gear_position</strong> field to <strong>Group</strong>. This will group the engine temperature values by the transmission gear position on the treemap so you can see which gears are associated with the hottest or coolest engine temperatures. The treemap sizes the groups according to their values, with the largest appearing on the upper-left and the smallest on the lower-right.</p>
<figure>
<img src="media/pbi-treemap-fields.png" title="Treemap settings" alt="Screenshot displaying where to drag the fields onto the treemap settings." /><figcaption>Screenshot displaying where to drag the fields onto the treemap settings.</figcaption>
</figure></li>
<li><p>Select the down arrow next to the <strong>enginetemperature</strong> field under <strong>Values</strong>. Select <strong>Average</strong> from the menu to aggregate the values by average instead of the sum.</p>
<figure>
<img src="media/pbi-treemap-average.png" title="Average engine temperature" alt="The Average menu option is highlighted for the enginetemperature value." /><figcaption>The Average menu option is highlighted for the enginetemperature value.</figcaption>
</figure></li>
<li><p>Your treemap should look similar to the following:</p>
<figure>
<img src="media/pbi-treemap.png" title="Treemap" alt="The treemap is shown on the report." /><figcaption>The treemap is shown on the report.</figcaption>
</figure></li>
<li><p>Select a blank area on the report to deselect the treemap. Now select the <strong>Area chart</strong> visualization.</p>
<figure>
<img src="media/pbi-areachart-vis.png" title="Area chart visualization" alt="The Area chart visualization is highlighted." /><figcaption>The Area chart visualization is highlighted.</figcaption>
</figure></li>
<li><p>Drag the <strong>region</strong> field to <strong>Legend</strong>, the <strong>speed</strong> field to <strong>Values</strong>, and the <strong>timestamp</strong> field to <strong>Axis</strong>. This will display an area chart with different colors indicating the region and the speed at which drivers travel over time within that region.</p>
<figure>
<img src="media/pbi-areachart-fields.png" title="Area chart settings" alt="Screenshot displaying where to drag the fields onto the area chart settings." /><figcaption>Screenshot displaying where to drag the fields onto the area chart settings.</figcaption>
</figure></li>
<li><p>Select the down arrow next to the <strong>speed</strong> field under <strong>Values</strong>. Select <strong>Average</strong> from the menu to aggregate the values by average instead of the sum.</p>
<figure>
<img src="media/pbi-areachart-average.png" title="Average speed" alt="The Average menu option is highlighted for the speed value." /><figcaption>The Average menu option is highlighted for the speed value.</figcaption>
</figure></li>
<li><p>Your area chart should look similar to the following:</p>
<figure>
<img src="media/pbi-areachart.png" title="Area chart" alt="The area chart on the report." /><figcaption>The area chart on the report.</figcaption>
</figure></li>
<li><p>Select a blank area on the report to deselect the area chart. Now select the <strong>Multi-row card</strong> visualization.</p>
<figure>
<img src="media/pbi-card-vis.png" title="Multi-row card visualization" alt="The multi-card visualization is highlighted." /><figcaption>The multi-card visualization is highlighted.</figcaption>
</figure></li>
<li><p>Drag the <strong>aggressivedriving</strong>, <strong>enginetempanomaly</strong>, and <strong>oilanomaly</strong> fields to <strong>Fields</strong>.</p>
<figure>
<img src="media/pbi-card-fields.png" title="Multi-row card settings" alt="Screenshot displaying where to drag the fields onto the multi-row card settings." /><figcaption>Screenshot displaying where to drag the fields onto the multi-row card settings.</figcaption>
</figure></li>
<li><p>Select the <strong>Format</strong> tab in the multi-row card settings, then expand <strong>Data labels</strong>. Set the <strong>Text size</strong> to 30. Expand <strong>Category labels</strong> and set the <strong>Text size</strong> to 12.</p>
<figure>
<img src="media/pbi-card-format.png" title="Multi-row card format" alt="Screenshot of the format tab." /><figcaption>Screenshot of the format tab.</figcaption>
</figure></li>
<li><p>Your multi-row card should look similar to the following:</p>
<figure>
<img src="media/pbi-card.png" title="Multi-row-card" alt="The multi-row card on the report." /><figcaption>The multi-row card on the report.</figcaption>
</figure></li>
<li><p>Select <strong>Save</strong> on the upper-right of the page.</p>
<figure>
<img src="media/pbi-save.png" title="Save" alt="The save button is highlighted." /><figcaption>The save button is highlighted.</figcaption>
</figure></li>
<li><p>Enter a name, such as “Vehicle Anomalies”, then select <strong>Save</strong>.</p>
<figure>
<img src="media/pbi-save-dialog.png" title="Save dialog" alt="Screenshot of the save dialog." /><figcaption>Screenshot of the save dialog.</figcaption>
</figure></li>
<li><p>Now let’s add this report to a dashboard. Select <strong>Pin Live Page</strong> on the upper-right of the page.</p>
<figure>
<img src="media/pbi-live.png" title="Pin Live Page" alt="The Pin Live Page button is highlighted." /><figcaption>The Pin Live Page button is highlighted.</figcaption>
</figure></li>
<li><p>Select <strong>New dashboard</strong>, then enter a name, such as “Vehicle Anomalies Dashboard”. Select <strong>Pin live</strong>. When prompted, select the option to view the dashboard. Otherwise, you can find the dashboard under My Workspace on the left-hand menu.</p>
<figure>
<img src="media/pbi-live-dialog.png" title="Pin to dashboard dialog" alt="Screenshot of the pin to dashboard dialog." /><figcaption>Screenshot of the pin to dashboard dialog.</figcaption>
</figure></li>
<li><p>The live dashboard refreshes and updates automatically while data is being captured. Hover over any point on a chart to view information about that item. Select one of the regions in the legend above the average speed chart, and all other charts will automatically filter by that region. Select a blank area of the chart to clear the filter.</p>
<figure>
<img src="media/pbi-dashboard.png" title="Dashboard" alt="The live dashboard view." /><figcaption>The live dashboard view.</figcaption>
</figure></li>
</ol>
<h2 id="wrap-up">Wrap-up</h2>
<p>Thank you for participating in the Leveraging Cosmos DB for near real-time analytics experience! There are many aspects of Cosmos DB that make it suitable for ingesting and serving real-time data at a global scale, some of which we have covered here today. Of course, there are other services that work alongside Cosmos DB to complete the processing pipeline.</p>
<p>To recap, you experienced:</p>
<ul>
<li>How to configure and send real-time data to Cosmos DB.</li>
<li>Processing data as it is saved to Cosmos DB through the use of Azure Functions, with the convenience of the Cosmos DB trigger, which reduces code and automatically invokes your processing logic as data arrives.</li>
<li>Ingesting processed data with Event Hubs and querying and reshaping that data with Azure Stream Analytics, then sending it to Power BI for reporting.</li>
<li>Rapidly creating a real-time dashboard in Power BI with interesting visualizations to view and explore vehicle anomaly data.</li>
</ul>
<h2 id="additional-resources-and-more-information">Additional resources and more information</h2>
<ul>
<li><a href="https://docs.microsoft.com/en-us/azure/cosmos-db/introduction">Introduction to Azure Cosmos DB</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/cosmos-db/change-feed">Overview of the Cosmos DB change feed</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/cosmos-db/high-availability">High availability with Azure Cosmos DB</a></li>
<li><a href="https://docs.microsoft.com/azure/cosmos-db/scaling-throughput">Scaling throughput in Azure Cosmos DB</a></li>
<li><a href="https://docs.microsoft.com/azure/cosmos-db/partition-data">Partitioning and horizontal scaling</a> in Azure Cosmos DB, plus <a href="https://docs.microsoft.com/azure/cosmos-db/scaling-throughput">guide for scaling throughput</a></li>
<li><a href="https://docs.microsoft.com/azure/event-hubs/event-hubs-about">About Event Hubs</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-introduction">What is Azure Stream Analytics?</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-window-functions">Intro to Stream Analytics windowing functions</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/cosmos-db/change-feed-functions">Trigger Azure Functions from Azure Cosmos DB</a></li>
</ul>