This demo will exercise various ways to integrate Spring Boot applications with Apache Kafka®, specifically Confluent Cloud.
The Spring for Apache Kafka (spring-kafka) project applies core Spring concepts to the development of Kafka-based solutions. It provides a "template" as a high-level abstraction for sending messages, along with support for message-driven POJOs via @KafkaListener annotations and a "listener container". These libraries promote the use of dependency injection and declarative programming.
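As a sketch of those two abstractions in Kotlin (the topic name, group ID, and class names are illustrative, not from this demo; assumes spring-kafka is on the classpath):

```kotlin
import org.springframework.kafka.annotation.KafkaListener
import org.springframework.kafka.core.KafkaTemplate
import org.springframework.stereotype.Component

// Hypothetical producer: KafkaTemplate is the high-level "template"
// abstraction for sending messages. The "orders" topic is made up.
@Component
class OrderProducer(private val kafkaTemplate: KafkaTemplate<String, String>) {
    fun send(orderId: String, payload: String) {
        kafkaTemplate.send("orders", orderId, payload)
    }
}

// Hypothetical message-driven POJO: the listener container invokes this
// method for each record consumed from the "orders" topic.
@Component
class OrderListener {
    @KafkaListener(topics = ["orders"], groupId = "spring-into-cc")
    fun listen(message: String) {
        println("Received: $message")
    }
}
```

Spring wires the KafkaTemplate and starts the listener container automatically once the broker connection properties are configured.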
- JDK 21 (I strongly suggest sdkman.)
- A Confluent Cloud account
- An IDE of choice. For Kotlin, that is likely IntelliJ IDEA (Community Edition will suffice).
Once you have registered for a Confluent Cloud account, we’re ready to create the infrastructure for this demo. Here we’ll use the Confluent Terraform Provider to provision assets in Confluent Cloud.
From your terminal, change into the terraform subdirectory and run the following commands:
terraform init
terraform plan -out "tfplan"
terraform apply "tfplan"
You can validate that your Confluent Cloud assets are provisioned via the Confluent Cloud Console or with the Confluent CLI. Should you encounter an issue in any of the steps above, please validate your Confluent CLI setup (specifically credentials) and your PATH variable.
The code in this demo relies on the assets you just provisioned, and the configuration of this Spring Boot application needs the credentials and endpoints that were created. Let’s export those to a properties file under your home directory for later use:
terraform output -json | jq -r 'to_entries[] | .key + "=" + (.value.value | tostring)' > ~/tools/spring-into-cc.properties
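To see what that jq filter does, here is a minimal run against a hand-written JSON document shaped like `terraform output -json` (the key and value are fabricated for illustration):

```shell
# Each top-level key becomes KEY=VALUE, pulling the nested .value field:
echo '{"CC_ENV_ID": {"sensitive": false, "type": "string", "value": "env-abc123"}}' \
  | jq -r 'to_entries[] | .key + "=" + (.value.value | tostring)'
# → CC_ENV_ID=env-abc123
```

The `tostring` call matters because some Terraform outputs (booleans, numbers) are not JSON strings, and jq cannot concatenate them with `+` otherwise.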
cat ~/tools/spring-into-cc.properties
CC_BROKER=<REDACTED>.confluent.cloud:9092
CC_BROKER_URL=https://<REDACTED>.confluent.cloud:443
CC_ENV_DISPLAY_NAME=spring-into-cc
CC_ENV_ID=<REDACTED>
CC_KAFKA_CLUSTER_ID=<REDACTED>
CC_SCHEMA_REGISTRY_ID=<REDACTED>
CC_SCHEMA_REGISTRY_URL=https://<REDACTED>.confluent.cloud
KAFKA_KEY_ID=<REDACTED>
KAFKA_KEY_SECRET=<REDACTED>
SCHEMA_REGISTRY_KEY_ID=<REDACTED>
SCHEMA_REGISTRY_KEY_SECRET=<REDACTED>
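One way to feed these values to the Spring Boot application (a sketch of the general technique, not necessarily verbatim what this demo's code does) is to import the file via Spring Boot's spring.config.import and reference its keys as property placeholders:

```properties
# Hypothetical application.properties wiring; key names match the exported file.
spring.config.import=file:${user.home}/tools/spring-into-cc.properties
spring.kafka.bootstrap-servers=${CC_BROKER}
spring.kafka.properties.schema.registry.url=${CC_SCHEMA_REGISTRY_URL}
```

Any unrecognized spring.kafka.properties.* entries are passed straight through to the underlying Kafka clients, which is how Schema Registry and SASL credential settings are typically supplied.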
Once you’re done, you can destroy the Confluent Cloud environment using Terraform from the terraform directory as follows:
terraform destroy -auto-approve