fixing anchor links
mumian committed Mar 5, 2015
1 parent 0fdd4c9 commit f4884db
Showing 7 changed files with 112 additions and 112 deletions.
34 changes: 17 additions & 17 deletions articles/stream-analytics-developer-guide.md
@@ -35,7 +35,7 @@ For more information, see [Introduction to Azure Stream Analytics][stream.analyt
Stream Analytics jobs are defined as one or more input sources, a query over the incoming stream data, and an output target.


##<a name="inputs"></a>Inputs
## Inputs

### Data stream

@@ -91,22 +91,22 @@ Depending on the input type used in the job, some additional fields with event m



###Additional resources
### Additional resources
For details on creating input sources, see [Azure Event Hubs developer guide][azure.event.hubs.developer.guide] and [Use Azure Blob Storage][azure.blob.storage.use].



##<a name="query"></a>Query
## Query
The logic to filter, manipulate, and process incoming data is defined in the Query of a Stream Analytics job. Queries are written in the Stream Analytics query language, a SQL-like language that is largely a subset of standard T-SQL syntax with specific extensions for temporal queries.

###Windowing
### Windowing
Windowing extensions allow aggregations and computations to be performed over subsets of events that fall within some period of time. Windowing functions are invoked in the GROUP BY clause. For example, the following query counts the events received per second:

SELECT Count(*)
FROM Input1
GROUP BY TumblingWindow(second, 1)

###Execution steps
### Execution steps
For more complex queries, the standard SQL clause WITH can be used to specify a temporary named result set. For example, this query uses WITH to perform a transformation with two execution steps:

WITH step1 AS (
    -- step bodies below are illustrative; the original example is collapsed in this diff view
    SELECT Topic, COUNT(*) AS TopicCount
    FROM Input1
    GROUP BY TumblingWindow(second, 3), Topic
),
step2 AS (
    SELECT Topic, TopicCount
    FROM step1
    WHERE TopicCount > 10
)
SELECT * FROM step2
@@ -121,7 +121,7 @@ For more complex queries, the standard SQL clause WITH can be used to specify a

To learn more about the query language, see [Azure Stream Analytics Query Language Reference][stream.analytics.query.language.reference].

##<a name="output"></a>Output
## Output
The output target is where the results of the Stream Analytics job are written. Results are written continuously to the output target as the job processes input events. The following output targets are supported:

- Azure Service Bus Event Hubs: Choose Event Hub as an output target for scenarios when multiple streaming pipelines need to be composed together, such as issuing commands back to devices.
@@ -130,20 +130,20 @@ The output target is where the results of the Stream Analytics job will be writt
- Azure SQL Database: This output target is appropriate for data that is relational in nature or for applications that depend on content being hosted in a database.


##<a name="scale"></a>Scale jobs
## Scale jobs

A Stream Analytics job can be scaled by configuring Streaming Units, which define the amount of processing power a job receives. Each Streaming Unit corresponds to roughly 1 MB/second of throughput. Each subscription has a quota of 12 Streaming Units per region to be allocated across jobs in that region.
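For example, a job that must sustain about 3 MB/second of input throughput needs at least 3 Streaming Units, leaving 9 of the 12-unit regional quota available for other jobs in that region.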

For details, see [Scale Azure Stream Analytics jobs][stream.analytics.scale.jobs].


##<a name="monitor"></a>Monitor and troubleshoot jobs
## Monitor and troubleshoot jobs

###Regional monitoring storage account
### Regional monitoring storage account

To enable job monitoring, Stream Analytics requires you to designate an Azure Storage account for monitoring data in each region containing Stream Analytics jobs. This is configured at the time of job creation.

###Metrics
### Metrics
The following metrics are available for monitoring the usage and performance of Stream Analytics jobs:

- Errors: number of error messages incurred by a Stream Analytics job
@@ -152,23 +152,23 @@ The following metrics are available for monitoring the usage and performance of
- Out of order events: number of events received out of order that were either dropped or given an adjusted timestamp, based on the out of order policy
- Data conversion errors: number of data conversion errors incurred by a Stream Analytics job

###Operation logs
### Operation logs
The best approach to debugging or troubleshooting a Stream Analytics job is through Azure Operation Logs. Operation Logs can be accessed under the Management Services section of the portal. To inspect the logs for your job, set Service Type to "Stream Analytics" and Service Name to the name of your job.


##<a name="manage"></a>Manage jobs
## Manage jobs

###Start and stop jobs
### Start and stop jobs
When starting a job, you will be prompted to specify a Start Output value, which determines when the job will start producing output. If the associated query includes a window, the job will begin picking up input from the input sources at the start of the required window duration in order to produce the first output event at the specified time. There are three options: Job Start Time, Custom Time, and Last Stopped Time. The default setting is Job Start Time. When a job has been stopped temporarily, the best practice is to choose Last Stopped Time for the Start Output value so the job resumes from the last output time and avoids data loss. For the Custom option, you must specify a date and time. This setting is useful for specifying how much historical data in the input sources to consume, or for picking up data ingestion from a specific time.

###Configure jobs
### Configure jobs
You can adjust the following top-level settings for a Stream Analytics job:

- Start output: Specifies when this job will start producing output. If the associated query includes a window, the job will begin picking up input from the input sources at the start of the required window duration in order to produce the first output event at the specified time. There are two options, Job Start Time and Custom. The default setting is Job Start Time. For the Custom option, you must specify a date and time. This setting is useful for specifying how much historical data in the input sources to consume, or for picking up data ingestion from a specific time, such as when a job was last stopped.
- Out of order policy: Settings for handling events that do not arrive at the Stream Analytics job sequentially. You can designate a time threshold within which events are reordered by specifying a Tolerance Window, and also determine the action to take on events outside this window: Drop or Adjust. Drop discards all events received out of order, and Adjust changes the System.Timestamp of out-of-order events to the timestamp of the most recently received ordered event.
- Locale: Use this setting to specify the internationalization preference for the Stream Analytics job. While timestamps of data are locale neutral, this setting affects how the job parses, compares, and sorts data. For the preview release, only en-US is supported.

###Status
### Status

The status of Stream Analytics jobs can be inspected in the Azure portal. Running jobs can be in one of three states: Idle, Processing, or Degraded. The definition of each state is below:

@@ -177,11 +177,11 @@ The status of Stream Analytics jobs can be inspected in the Azure portal. Runnin
- Degraded: This state indicates that a Stream Analytics job is encountering one of the following errors: Input/output communication errors, query errors, retry-able run-time errors. To distinguish what type of error(s) the job is encountering, view the Operation Logs.


##<a name="support"></a>Get support
## Get support
For additional support, see [Azure Stream Analytics forum][stream.analytics.forum].


##<a name="nextsteps"></a>Next steps
## Next steps

- [Introduction to Azure Stream Analytics][stream.analytics.introduction]
- [Get started using Azure Stream Analytics][stream.analytics.get.started]
26 changes: 13 additions & 13 deletions articles/stream-analytics-dotnet-management-sdk.md
@@ -26,7 +26,7 @@ Azure Stream Analytics is a fully managed service providing low latency, highly
This article demonstrates how to use the Azure Stream Analytics Management .NET SDK.


##<a name="inputs"></a>Prerequisites
## Prerequisites
Before you begin this article, you must have the following:

- Visual Studio 2012 or 2013.
@@ -48,7 +48,7 @@ Before you begin this article, you must have the following:
4. This article assumes you have already set up an input source and output target to use. See [Get Started Using Azure Stream Analytics](http://azure.microsoft.com/documentation/articles/stream-analytics-get-started/) to set up a sample input and/or output to be used by this article.


##<a name="setupproject"></a>Setup a project
## Setup a project

1. Create a Visual Studio C# .NET console application.
2. In the Package Manager Console, run the following commands to install the NuGet packages. The first one is the Azure Stream Analytics Management .NET SDK. The second one is the Azure Active Directory client that will be used for authentication.
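The commands themselves are collapsed in this view. A sketch of what they look like, assuming the standard NuGet package IDs (the exact names and the prerelease flag are assumptions):

    Install-Package Microsoft.Azure.Management.StreamAnalytics -Pre
    Install-Package Microsoft.IdentityModel.Clients.ActiveDirectory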
@@ -123,7 +123,7 @@ Before you begin this article, you must have the following:
}


##<a name="createclient"></a>Create a Stream Analytics management client
## Create a Stream Analytics management client

A *StreamAnalyticsManagementClient* object allows you to manage the job and its components, such as inputs, outputs, and the transformation.
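The setup code is collapsed in this view. A minimal sketch follows, assuming interactive authentication through the ADAL v2 client; the wrapper types and the AcquireToken overload are taken from memory of the 2015 preview SDK and may differ:

// Authenticate interactively against Azure Active Directory (ADAL v2; placeholder values throughout).
AuthenticationContext authContext = new AuthenticationContext("https://login.windows.net/<tenant id>");
AuthenticationResult tokenResult = authContext.AcquireToken(
    "https://management.core.windows.net/",   // resource to access
    "<AAD application client id>",            // placeholder client id
    new Uri("urn:ietf:wg:oauth:2.0:oob"),
    PromptBehavior.Always);

// Wrap the token in credentials and create the management client.
SubscriptionCloudCredentials credentials =
    new TokenCloudCredentials("<subscription id>", tokenResult.AccessToken);
StreamAnalyticsManagementClient client = new StreamAnalyticsManagementClient(credentials);

string resourceGroupName = "<resource group name>";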

@@ -148,7 +148,7 @@ The resourceGroupName variable's value should be the same as the name of the res

The remaining sections of this article assume that this code is at the beginning of the Main method.

##<a name="createjob"></a>Create a Stream Analytics Job
## Create a Stream Analytics Job

The following code creates a Stream Analytics job under the resource group that you have defined. You will add inputs, an output, and a transformation to the job later.
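The job definition itself is collapsed in this view. A hypothetical sketch, with type and property names assumed from the 2015 preview SDK:

string streamAnalyticsJobName = "<job name>";

// Hypothetical job definition; property names are assumptions and may differ.
JobCreateOrUpdateParameters jobCreateParameters = new JobCreateOrUpdateParameters()
{
    Job = new Job()
    {
        Name = streamAnalyticsJobName,
        Location = "Central US",
        Properties = new JobProperties()
        {
            Sku = new Sku() { Name = "standard" }
        }
    }
};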

@@ -173,7 +173,7 @@ The following code creates a Stream Analytics job under the resource group that
JobCreateOrUpdateResponse jobCreateResponse = client.StreamingJobs.CreateOrUpdate(resourceGroupName, jobCreateParameters);


##<a name="createinput"></a>Create a Stream Analytics Input Source
## Create a Stream Analytics Input Source

The following code creates a Stream Analytics input source with the Blob input source type and CSV serialization. To create an Event Hub input source, use *EventHubStreamInputDataSource* instead of *BlobStreamInputDataSource*. Similarly, you can customize the serialization type of the input source.
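The input definition is collapsed in this view. In the hypothetical sketch below, *BlobStreamInputDataSource* and the CSV serialization come from the text above, while the wrapper types and property names are assumptions based on the 2015 preview SDK:

string streamAnalyticsInputName = "<input name>";

// Hypothetical input definition; wrapper types and property names are assumptions.
InputCreateOrUpdateParameters inputCreateParameters = new InputCreateOrUpdateParameters()
{
    Input = new Input()
    {
        Name = streamAnalyticsInputName,
        Properties = new StreamInputProperties()
        {
            // CSV serialization, as described above
            Serialization = new CsvSerialization()
            {
                Properties = new CsvSerializationProperties()
                {
                    Encoding = "UTF8",
                    FieldDelimiter = ","
                }
            },
            // Blob input source; use EventHubStreamInputDataSource for an Event Hub instead
            DataSource = new BlobStreamInputDataSource()
            {
                Properties = new BlobStreamInputDataSourceProperties()
                {
                    StorageAccounts = new[]
                    {
                        new StorageAccount()
                        {
                            AccountName = "<storage account name>",
                            AccountKey = "<storage account key>"
                        }
                    },
                    Container = "<container name>",
                    PathPattern = ""
                }
            }
        }
    }
};

InputCreateOrUpdateResponse inputCreateResponse =
    client.Inputs.CreateOrUpdate(resourceGroupName, streamAnalyticsJobName, inputCreateParameters);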

@@ -219,15 +219,15 @@ The following code creates a Stream Analytics input source with the Blob input s
Input sources are tied to a specific job. To use the same input source for different jobs, you must call the method again specifying a different job name.


##<a name="testinput"></a>Test a Stream Analytics Input Source
## Test a Stream Analytics Input Source

The *TestConnection* method tests whether the Stream Analytics job is able to connect to the input source, as well as other aspects specific to the input source type. For example, for the blob input source you created in an earlier step, the method checks that the storage account name and key pair can be used to connect to the storage account, and that the specified container exists.

// Test input source connection
DataSourceTestConnectionResponse inputTestResponse =
client.Inputs.TestConnection(resourceGroupName, streamAnalyticsJobName, streamAnalyticsInputName);

##<a name="createoutput"></a>Create a Stream Analytics output target
## Create a Stream Analytics output target

Creating an output target is very similar to creating a Stream Analytics input source. Like input sources, output targets are tied to a specific job. To use the same output target for different jobs, you must call the method again specifying a different job name.

@@ -259,15 +259,15 @@ The following code creates a SQL output target. You can customize the output tar
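The output definition is collapsed in this view. A hypothetical sketch, with type and property names assumed from the 2015 preview SDK:

string streamAnalyticsOutputName = "<output name>";

// Hypothetical SQL output definition; type and property names are assumptions.
OutputCreateOrUpdateParameters jobOutputCreateParameters = new OutputCreateOrUpdateParameters()
{
    Output = new Output()
    {
        Name = streamAnalyticsOutputName,
        Properties = new OutputProperties()
        {
            DataSource = new SqlAzureOutputDataSource()
            {
                Properties = new SqlAzureOutputDataSourceProperties()
                {
                    Server = "<SQL server name>",
                    Database = "<database name>",
                    User = "<login>",
                    Password = "<password>",
                    Table = "<table name>"
                }
            }
        }
    }
};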
OutputCreateOrUpdateResponse outputCreateResponse =
client.Outputs.CreateOrUpdate(resourceGroupName, streamAnalyticsJobName, jobOutputCreateParameters);

##<a name="testoutput"></a>Test a Stream Analytics output target
## Test a Stream Analytics output target

The Stream Analytics output target also has a TestConnection method for testing connections.

// Test output target connection
DataSourceTestConnectionResponse outputTestResponse =
client.Outputs.TestConnection(resourceGroupName, streamAnalyticsJobName, streamAnalyticsOutputName);

##<a name="createtransform"></a>Create a Stream Analytics Transformation
## Create a Stream Analytics Transformation

The following code creates a Stream Analytics transformation with the query “select * from Input” and specifies that one Streaming Unit be allocated to the Stream Analytics job. For more information on adjusting Streaming Units, see [Scale Azure Stream Analytics jobs](http://azure.microsoft.com/documentation/articles/stream-analytics-scale-jobs/).
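The transformation definition is collapsed in this view. A hypothetical sketch, assuming the 2015 preview SDK type names; the query string and the single streaming unit come from the text above:

// Hypothetical transformation definition; type names are assumptions.
TransformationCreateOrUpdateParameters transformationCreateParameters = new TransformationCreateOrUpdateParameters()
{
    Transformation = new Transformation()
    {
        Name = "<transformation name>",
        Properties = new TransformationProperties()
        {
            StreamingUnits = 1,               // one streaming unit, per the text above
            Query = "select * from Input"     // query from the text above
        }
    }
};

TransformationCreateOrUpdateResponse transformationCreateResponse =
    client.Transformations.CreateOrUpdate(resourceGroupName, streamAnalyticsJobName, transformationCreateParameters);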

@@ -290,7 +290,7 @@ The following code creates a Stream Analytics transformation with the query “s

Like inputs and outputs, a transformation is tied to the specific Stream Analytics job it was created under.

##<a name="startjob"></a>Start a Stream Analytics Job
## Start a Stream Analytics Job
After creating a Stream Analytics job and its input(s), output(s), and transformation, you can start the job by calling the Start method.

The following sample code starts a Stream Analytics job with a custom output start time set to December 12, 2012 12:12:12 UTC.
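The start call is collapsed in this view. A hypothetical sketch, assuming the 2015 preview SDK parameter names; the custom start time comes from the text above:

// Start the job with a custom output start time (parameter names are assumptions).
JobStartParameters jobStartParameters = new JobStartParameters()
{
    OutputStartMode = OutputStartMode.CustomTime,
    OutputStartTime = new DateTime(2012, 12, 12, 12, 12, 12, DateTimeKind.Utc)
};

LongRunningOperationResponse jobStartResponse =
    client.StreamingJobs.Start(resourceGroupName, streamAnalyticsJobName, jobStartParameters);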
@@ -306,20 +306,20 @@ The following sample code starts a Stream Analytics job with a custom output sta



##<a name="stopjob"></a>Stop a Stream Analytics Job
## Stop a Stream Analytics Job
You can stop a running Stream Analytics job by calling the Stop method.

// Stop a Stream Analytics job
LongRunningOperationResponse jobStopResponse = client.StreamingJobs.Stop(resourceGroupName, streamAnalyticsJobName);

##<a name="deletejob"></a>Delete a Stream Analytics Job
## Delete a Stream Analytics Job
The Delete method will delete the job as well as its underlying sub-resources, including the input(s), output(s), and transformation of the job.

// Delete a Stream Analytics job
LongRunningOperationResponse jobDeleteResponse = client.StreamingJobs.Delete(resourceGroupName, streamAnalyticsJobName);


##<a name="nextsteps"></a>Next steps
## Next steps

- [Introduction to Azure Stream Analytics][stream.analytics.introduction]
- [Get started using Azure Stream Analytics][stream.analytics.get.started]