- Go to Confluent’s Getting Started guide.
- You may select any of the listed programming languages.
- In the Kafka Setup section, set Kafka Location to "Other".
- Set the Bootstrap Server URL to `0.tcp.eu.ngrok.io:19589`.
> 🚨 Changing Kafka Location is needed in order to generate your config file correctly. If you forget to change it, you will get SSL/TLS-related errors later.
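For reference, with Kafka Location set to "Other" the generated configuration should contain only the bootstrap server and no security settings. A minimal sketch in Python dict style, assuming the confluent-kafka client (the property name is the same across Kafka clients):

```python
# Minimal plaintext configuration -- no security.protocol, SASL, or SSL
# settings. If your generated config contains those, Kafka Location was
# probably not set to "Other".
config = {"bootstrap.servers": "0.tcp.eu.ngrok.io:19589"}
```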
> 💡 You can run the following command to verify that you are able to connect to the Kafka cluster: `nc -vz 0.tcp.eu.ngrok.io 19589`
- Go to this Kafka UI to create a topic (or create it from code, as sketched below).
- Set Number of partitions to 2.
- Since everyone is connecting to the same cluster, make up your own unique topic name to avoid colliding with others.
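If you prefer to create the topic programmatically instead of through the UI, a sketch using the Python AdminClient follows. This assumes the confluent-kafka package; `my-unique-topic` is a placeholder for your own topic name, and `replication_factor=1` assumes the workshop cluster runs a single broker.

```python
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "0.tcp.eu.ngrok.io:19589"})

# "my-unique-topic" is a placeholder -- pick a name nobody else is using.
futures = admin.create_topics(
    [NewTopic("my-unique-topic", num_partitions=2, replication_factor=1)]
)
for topic, future in futures.items():
    try:
        future.result()  # raises if creation failed (e.g. the topic already exists)
        print(f"Created topic {topic}")
    except Exception as e:
        print(f"Failed to create topic {topic}: {e}")
```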
- Follow the rest of the Getting Started guide to produce and consume your own events.
- Remember to substitute your own topic name where applicable.
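As a reference point for whichever language you picked, here is a minimal producer sketch in Python (confluent-kafka assumed; `my-unique-topic` is a placeholder for the topic you created above):

```python
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "0.tcp.eu.ngrok.io:19589"})

def on_delivery(err, msg):
    # Invoked from poll()/flush() once the broker acknowledges (or rejects) the message.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}] at offset {msg.offset()}")

for i in range(5):
    # Messages with the same key always land in the same partition.
    producer.produce("my-unique-topic", key=str(i), value=f"hello {i}", callback=on_delivery)

producer.flush()  # block until all outstanding messages are delivered
```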
> 💡 You can check your topic’s partitions, messages and consumers in the UI.
- Try running two consumers at the same time, and then produce some messages. How are the messages distributed among the consumers?
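To see the distribution for yourself, run a consumer like the sketch below in two terminals with the same `group.id` (the group and topic names here are placeholders; confluent-kafka assumed):

```python
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "0.tcp.eu.ngrok.io:19589",
    "group.id": "my-unique-topic-readers",  # same value in both terminals
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["my-unique-topic"])

while True:
    msg = consumer.poll(1.0)
    if msg is None:
        continue
    if msg.error():
        print(f"Consumer error: {msg.error()}")
        continue
    # Printing the partition makes the assignment visible: with 2 partitions
    # and 2 consumers in one group, each process should print only one
    # partition number.
    print(f"partition {msg.partition()}: {msg.value().decode('utf-8')}")
```

Because the topic has two partitions and both consumers share a group, the group coordinator assigns one partition to each consumer, so every message is processed by exactly one of them. Stop one consumer and the group rebalances, handing both partitions to the survivor.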
Consume the topic `public.nmp-kafka-workshop.announcements`. This topic contains data encoded using Apache Avro, with the following schema:
```json
{
  "namespace": "examples.avro",
  "type": "record",
  "name": "Announcement",
  "fields": [
    { "name": "author", "type": "string" },
    { "name": "message", "type": "string" }
  ]
}
```
Use the following Schema Registry URL: http://8a5e-80-91-33-134.eu.ngrok.io

Solving this task will require you to do some digging on your own. For some languages you have the choice between deserializing into a generic record (i.e. a hash map or dictionary) or a specific record (i.e. a data type generated from the schema). Some pre-generated data types for the Announcement record are available in this repository.
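As a starting point, here is a sketch of the generic-record approach in Python. It assumes the confluent-kafka package with its Avro extras (`confluent-kafka[avro]`); the `group.id` is a placeholder, and the deserializer is left to fetch the writer’s schema from the registry rather than being given the schema string above.

```python
from confluent_kafka import Consumer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer
from confluent_kafka.serialization import SerializationContext, MessageField

schema_registry = SchemaRegistryClient({"url": "http://8a5e-80-91-33-134.eu.ngrok.io"})
# With no schema string given, the deserializer looks up the writer's
# schema in the registry using the id embedded in each message.
deserializer = AvroDeserializer(schema_registry)

consumer = Consumer({
    "bootstrap.servers": "0.tcp.eu.ngrok.io:19589",
    "group.id": "my-announcements-reader",  # placeholder
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["public.nmp-kafka-workshop.announcements"])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        announcement = deserializer(
            msg.value(), SerializationContext(msg.topic(), MessageField.VALUE)
        )
        # Generic record: a plain dict keyed by the schema's field names.
        print(f"{announcement['author']}: {announcement['message']}")
except KeyboardInterrupt:
    pass
finally:
    consumer.close()
```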