---
title: Data streaming with Azure Event Hubs using the Kafka protocol | Microsoft Docs
description: This article provides information on how to stream into Azure Event Hubs using the Kafka protocol and APIs.
services: event-hubs
author: basilhariri
ms.author: bahariri
ms.service: event-hubs
ms.topic: quickstart
ms.custom: seodec18
ms.date: 05/06/2019
---
This quickstart shows how to stream into Kafka-enabled Event Hubs without changing your protocol clients or running your own clusters. You learn how to use your producers and consumers to talk to Kafka-enabled Event Hubs with just a configuration change in your applications. Azure Event Hubs supports Apache Kafka version 1.0.
> [!NOTE]
> This sample is available on GitHub.
To complete this quickstart, make sure you have the following prerequisites:

- Read through the Event Hubs for Apache Kafka article.
- An Azure subscription. If you don't have one, create a free account before you begin.
- Java Development Kit (JDK) 1.7+.
- Download and install a Maven binary archive.
- Git.
- A Kafka-enabled Event Hubs namespace.
1. Sign in to the Azure portal, and click Create a resource at the top left of the screen.

2. Search for Event Hubs and select the options shown here.

3. Provide a unique name and enable Kafka on the namespace. Click Create.

   > [!NOTE]
   > Event Hubs for Kafka is supported only by Standard and Dedicated tier Event Hubs. Basic tier Event Hubs will return a Topic Authorization Error in response to any Kafka operations.

4. Once the namespace is created, on the Settings tab, click Shared access policies to get the connection string.

5. You can choose the default RootManageSharedAccessKey, or add a new policy. Click the policy name and copy the connection string.

6. Add this connection string to your Kafka application configuration.
You can now stream events from your applications that use the Kafka protocol into Event Hubs.
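In the Java quickstart, these settings live in a `.config` file that the sample reads into a `java.util.Properties` object. The sketch below shows that pattern with the values inlined as a string for illustration; the namespace name and connection string are placeholders, not real credentials.

```java
import java.io.StringReader;
import java.util.Properties;

public class LoadEventHubsConfig {
    public static void main(String[] args) throws Exception {
        // In the quickstart these lines sit in src/main/resources/producer.config;
        // the namespace and password below are placeholders.
        String config = String.join("\n",
            "bootstrap.servers=mynamespace.servicebus.windows.net:9093",
            "security.protocol=SASL_SSL",
            "sasl.mechanism=PLAIN",
            "sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"$ConnectionString\" password=\"{YOUR.EVENTHUBS.CONNECTION.STRING}\";");

        Properties props = new Properties();
        // Properties.load parses the same key=value format used by the .config files.
        props.load(new StringReader(config));

        System.out.println(props.getProperty("security.protocol"));
    }
}
```

Note that the SASL username is the literal string `$ConnectionString`; the namespace connection string you copied from the portal goes into the password field.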
1. Clone the Azure Event Hubs for Kafka repository.

2. Navigate to `azure-event-hubs-for-kafka/quickstart/java/producer`.

3. Update the configuration details for the producer in `src/main/resources/producer.config` as follows:

   ```
   bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
   security.protocol=SASL_SSL
   sasl.mechanism=PLAIN
   sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
   ```

4. Run the producer code and stream into Kafka-enabled Event Hubs:

   ```shell
   mvn clean package
   mvn exec:java -Dexec.mainClass="TestProducer"
   ```
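For orientation, a minimal producer along the lines of the repo's `TestProducer` might look like the sketch below. This is an illustrative sketch, not the repository's exact code: it assumes the `kafka-clients` dependency from the quickstart's `pom.xml`, and the topic name `test` (which maps to an event hub inside your namespace) is a placeholder.

```java
import java.io.FileReader;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SketchProducer {
    public static void main(String[] args) throws Exception {
        // Load the SASL/SSL settings you configured above.
        Properties props = new Properties();
        props.load(new FileReader("src/main/resources/producer.config"));
        props.put("key.serializer", "org.apache.kafka.common.serialization.LongSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<Long, String> producer = new KafkaProducer<>(props)) {
            for (int i = 0; i < 10; i++) {
                // The Kafka topic name corresponds to an event hub in the namespace.
                producer.send(new ProducerRecord<>("test",
                        System.currentTimeMillis(), "Test Data #" + i));
            }
            producer.flush();
        }
    }
}
```

Because all broker-specific settings come from `producer.config`, the same code works against an Apache Kafka cluster or a Kafka-enabled Event Hubs namespace; only the configuration file changes.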
1. Navigate to `azure-event-hubs-for-kafka/quickstart/java/consumer`.

2. Update the configuration details for the consumer in `src/main/resources/consumer.config` as follows:

   ```
   bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
   security.protocol=SASL_SSL
   sasl.mechanism=PLAIN
   sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
   ```

3. Run the consumer code and process events from Kafka-enabled Event Hubs using your Kafka clients:

   ```shell
   mvn clean package
   mvn exec:java -Dexec.mainClass="TestConsumer"
   ```
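The consuming side follows the same pattern. The sketch below is illustrative rather than the repository's exact `TestConsumer` code; it assumes the `kafka-clients` dependency, subscribes to the placeholder topic `test`, and uses `$Default`, the consumer group every Event Hubs namespace provides.

```java
import java.io.FileReader;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SketchConsumer {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.load(new FileReader("src/main/resources/consumer.config"));
        // Kafka consumer groups map to Event Hubs consumer groups;
        // $Default exists on every event hub.
        props.put("group.id", "$Default");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.LongDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<Long, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("test"));
            while (true) {
                ConsumerRecords<Long, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<Long, String> record : records) {
                    System.out.printf("Received: offset=%d, value=%s%n",
                            record.offset(), record.value());
                }
            }
        }
    }
}
```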
If your Kafka-enabled event hub has events, the consumer now starts receiving them.
In this article, you learned how to stream into Kafka-enabled Event Hubs without changing your protocol clients or running your own clusters. To learn more, see the following articles:
- Learn about Event Hubs
- Learn about Event Hubs for Kafka
- Explore more samples on the Event Hubs for Kafka GitHub
- Use MirrorMaker to stream events from on-premises Kafka to Kafka-enabled Event Hubs in the cloud.
- Learn how to stream into Kafka-enabled Event Hubs using Apache Flink or Akka Streams.