---
title: Data streaming with Azure Event Hubs using the Kafka protocol | Microsoft Docs
description: This article provides information on how to stream into Azure Event Hubs using the Kafka protocol and APIs.
services: event-hubs
author: basilhariri
ms.author: bahariri
ms.service: event-hubs
ms.topic: quickstart
ms.custom: seodec18
ms.date: 05/06/2019
---

Data streaming with Event Hubs using the Kafka protocol

This quickstart shows how to stream into Kafka-enabled Event Hubs without changing your protocol clients or running your own clusters. You learn how to use your producers and consumers to talk to Kafka-enabled Event Hubs with just a configuration change in your applications. Azure Event Hubs supports Apache Kafka version 1.0.

Note

This sample is available on GitHub

Prerequisites

To complete this quickstart, make sure you have the following prerequisites:

  * An Azure subscription. If you don't have one, create a free account before you begin.
  * Java Development Kit (JDK) 1.7 or later.
  * Apache Maven, to build and run the sample project.
  * Git, to clone the sample repository.

Create a Kafka-enabled Event Hubs namespace

  1. Sign in to the Azure portal, and click Create a resource at the top left of the screen.

  2. Search for Event Hubs and select the options shown here:

    Search for Event Hubs in the portal

  3. Provide a unique name, enable Kafka on the namespace, and then click Create. Note: Event Hubs for Kafka is supported only on Standard and Dedicated tier namespaces. Basic tier namespaces return a Topic Authorization Error in response to any Kafka operation.

    Create a namespace

  4. Once the namespace is created, select Shared access policies under Settings to get the connection string.

    Click Shared access policies

  5. You can choose the default RootManageSharedAccessKey, or add a new policy. Click the policy name and copy the connection string.

    Select a policy

  6. Add this connection string to your Kafka application configuration.
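The connection string you copy from the portal generally has the following shape; the namespace name and key shown here are placeholders:

    Endpoint=sb://{YOUR.EVENTHUBS.NAMESPACE}.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey={YOUR.KEY}

The host name in the Endpoint value ({YOUR.EVENTHUBS.NAMESPACE}.servicebus.windows.net) is the fully qualified domain name (FQDN) used as the Kafka bootstrap server in the next section.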

You can now stream events from your applications that use the Kafka protocol into Event Hubs.

Send and receive messages with Kafka in Event Hubs

  1. Clone the Azure Event Hubs for Kafka repository.

  2. Navigate to azure-event-hubs-for-kafka/quickstart/java/producer.

  3. Update the configuration details for the producer in src/main/resources/producer.config as follows:

    bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
    security.protocol=SASL_SSL
    sasl.mechanism=PLAIN
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
  4. Run the producer code and stream into Kafka-enabled Event Hubs:

    mvn clean package
    mvn exec:java -Dexec.mainClass="TestProducer"                                    
  5. Navigate to azure-event-hubs-for-kafka/quickstart/java/consumer.

  6. Update the configuration details for the consumer in src/main/resources/consumer.config as follows:

    bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
    security.protocol=SASL_SSL
    sasl.mechanism=PLAIN
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
  7. Run the consumer code and process events from Kafka-enabled Event Hubs using your Kafka clients:

    mvn clean package
    mvn exec:java -Dexec.mainClass="TestConsumer"                                    

If your Kafka-enabled event hub has incoming events, the consumer now starts receiving them. The sketches below show roughly what the producer and consumer steps boil down to.
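For reference, here is a minimal sketch of a Kafka producer that loads producer.config and sends a few events to an event hub. The class name SimpleProducer, the topic name test, and the relative path to the config file are illustrative assumptions, not the exact code of TestProducer in the repository:

    import java.io.FileReader;
    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class SimpleProducer {
        public static void main(String[] args) throws Exception {
            // Load the SASL_SSL settings that point the client at the Event Hubs namespace.
            Properties props = new Properties();
            props.load(new FileReader("src/main/resources/producer.config"));
            props.put("key.serializer", "org.apache.kafka.common.serialization.LongSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            // "test" stands in for the event hub (Kafka topic) that events are written to.
            try (KafkaProducer<Long, String> producer = new KafkaProducer<>(props)) {
                for (int i = 0; i < 10; i++) {
                    producer.send(new ProducerRecord<>("test", (long) i, "Test message " + i));
                }
                producer.flush();
            }
        }
    }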
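Similarly, a minimal consumer sketch loads consumer.config and polls the same topic. It assumes a Kafka client library version 2.0 or later (for the Duration-based poll) and uses Event Hubs' default consumer group, $Default; the class name and topic name are again illustrative:

    import java.io.FileReader;
    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class SimpleConsumer {
        public static void main(String[] args) throws Exception {
            // Load the SASL_SSL settings that point the client at the Event Hubs namespace.
            Properties props = new Properties();
            props.load(new FileReader("src/main/resources/consumer.config"));
            // Event Hubs' default consumer group; omit if consumer.config already sets one.
            props.put("group.id", "$Default");
            props.put("key.deserializer", "org.apache.kafka.common.serialization.LongDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

            // Subscribe to the event hub (Kafka topic) and poll for new events.
            try (KafkaConsumer<Long, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("test"));
                while (true) {
                    ConsumerRecords<Long, String> records = consumer.poll(Duration.ofMillis(1000));
                    for (ConsumerRecord<Long, String> record : records) {
                        System.out.printf("Received: offset = %d, value = %s%n", record.offset(), record.value());
                    }
                }
            }
        }
    }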

Next steps

In this article, you learned how to stream into Kafka-enabled Event Hubs without changing your protocol clients or running your own clusters. To learn more, continue with the following tutorial: