Playground for Kafka/Confluent Docker experimentation...
Quick-start examples from the Confluent docs, but in Docker versions for ease of use.
* You can change the default connector version by setting the CONNECTOR_TAG
environment variable before starting a test; get more details here
- How to connect your components to Confluent Cloud
- How to monitor your Confluent Cloud cluster
- How to restrict access
- etc...
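For instance, pinning a connector version is a one-liner before launching any example; a minimal sketch (the version number and the script path below are hypothetical, used only for illustration):

```shell
# Pin the connector version the example should pull
# (1.6.2 is an illustrative value, not a recommendation)
export CONNECTOR_TAG=1.6.2

# then start a test as usual; the playground reads CONNECTOR_TAG
# (script path is an assumption for illustration)
./connect/connect-aws-kinesis-source/kinesis.sh
```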
AWS Kinesis source
ServiceNow source
ServiceNow sink
MongoDB source
Firebase
- Using cp-ansible with Confluent Cloud
- Using cp-helm-charts with Confluent Cloud
- Using confluent operator with Confluent Cloud
- Demo using dabz/ccloudexporter to pull Metrics API data from a Confluent Cloud cluster and export it to Prometheus (a Grafana dashboard is also available)
.NET client (producer/consumer)
Go client (producer/consumer)
KafkaJS client (producer/consumer)
Python client (producer/consumer)
kafka-admin: managing topics and ACLs using matt-mangia/kafka-admin
Confluent Replicator: OnPrem-to-Cloud and Cloud-to-Cloud examples
Multi-Cluster Schema Registry with hybrid configuration (onprem/confluent cloud)
- Confluent REST Proxy Security Plugin with Principal Propagation
- Confluent Schema Registry Security Plugin
- Migrate Schemas to Confluent Cloud using Confluent Replicator
- Confluent Cloud Networking using HAProxy
- Multiple Event Types in the Same Topic
Using Multi-Data-Center setup with US 🇺🇸 and EUROPE 🇪🇺 clusters.
Using Confluent Replicator as connector
- Using PLAINTEXT
- Using SASL_PLAIN
- Using Kerberos
- Using Confluent Replicator as executable
- Using PLAINTEXT
- Using SASL_PLAIN
- Using Kerberos
Using Mirror Maker 2
- Using PLAINTEXT
Single cluster:
- PLAINTEXT: no security
- SASL_PLAIN: no SSL encryption, SASL/PLAIN authentication
- SASL/SCRAM: no SSL encryption, SASL/SCRAM-SHA-256 authentication
- SASL_SSL: SSL encryption, SASL/PLAIN authentication
- 2WAY_SSL: SSL encryption, SSL authentication
- Kerberos: no SSL encryption, Kerberos GSSAPI authentication
- SSL_Kerberos: SSL encryption, Kerberos GSSAPI authentication
- LDAP Authentication with SASL/PLAIN: no SSL encryption, SASL/PLAIN authentication using LDAP
- LDAP Authorizer with SASL/PLAIN: no SSL encryption, SASL/PLAIN authentication, LDAP Authorizer for ACL authorization
- RBAC with SASL/PLAIN: no SSL encryption, SASL/PLAIN authentication
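For a sense of what separates these profiles, the SASL_PLAIN variant mostly comes down to the broker's listener and SASL settings. A minimal docker-compose fragment, assuming the confluentinc/cp-kafka image and a JAAS file mounted at /etc/kafka/kafka_server_jaas.conf (both assumptions; the playground's actual compose files may differ, and other required settings such as ZooKeeper and broker id are omitted here):

```yaml
broker:
  image: confluentinc/cp-kafka:7.4.0   # image tag is illustrative
  environment:
    KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: "SASL_PLAINTEXT:SASL_PLAINTEXT"
    KAFKA_ADVERTISED_LISTENERS: "SASL_PLAINTEXT://broker:9092"
    KAFKA_SECURITY_INTER_BROKER_PROTOCOL: "SASL_PLAINTEXT"
    KAFKA_SASL_ENABLED_MECHANISMS: "PLAIN"
    KAFKA_SASL_MECHANISM_INTER_BROKER_PROTOCOL: "PLAIN"
    # the JAAS file defines broker/client usernames and passwords
    KAFKA_OPTS: "-Djava.security.auth.login.config=/etc/kafka/kafka_server_jaas.conf"
  volumes:
    - ./kafka_server_jaas.conf:/etc/kafka/kafka_server_jaas.conf
```

The other profiles differ mainly by swapping the protocol map (e.g. SASL_SSL adds keystore/truststore settings) and the JAAS contents.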
Multi-Data-Center setup:
- PLAINTEXT: no security
- SASL_PLAIN: no SSL encryption, SASL/PLAIN authentication
- Kerberos: no SSL encryption, Kerberos GSSAPI authentication
- Control Center
- Tiered Storage
- Tiered storage with AWS S3
- Tiered storage with Minio (unsupported)
- Confluent Rebalancer
- JMS Client
- RBAC with SASL/PLAIN: no SSL encryption, SASL/PLAIN authentication
- Audit Logs
- Confluent REST Proxy Security Plugin with SASL_SSL and 2WAY_SSL Principal Propagation
- Cluster Linking
Easily play with Confluent Platform Ansible playbooks by using Ubuntu-based Docker images generated daily from this cp-ansible-playground repository.
There is also a Confluent Cloud version available here.
- Confluent Replicator [also with SASL_SSL and 2WAY_SSL]
- Testing Separate principals (connector.client.config.override.policy) for Source connector (SFTP source)
- Testing Separate principals (connector.client.config.override.policy) for Sink connector (SFTP sink)
- How to write logs to files when using docker-compose
- Publish logs to kafka with Elastic Filebeat
.NET basic producer
KafkaJS client (producer/consumer)
Monitor Confluent Platform with Datadog
- Testing KIP-108 Create Topic Policy
You just need to have docker and docker-compose installed on your machine!
Every command used in the playground runs through Docker; this includes jq
(unless you already have it on your host), aws, az, gcloud, etc.
The goal is consistent behaviour that depends only on Docker.
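The wrapping pattern can be sketched as follows: prefer the host binary when it exists, and otherwise run the tool from a throwaway container (the stedolan/jq image name is an assumption used here only for illustration; the playground may use a different image):

```shell
#!/bin/sh
# Use the host's jq if installed; otherwise run jq inside a container.
if command -v jq >/dev/null 2>&1; then
  JQ="jq"
else
  # community image name, purely illustrative
  JQ="docker run --rm -i stedolan/jq"
fi

# either way, callers invoke the tool the same way
echo '{"cluster":"playground"}' | $JQ -r '.cluster'
```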
If you want to run it on an EC2 instance (highly recommended if you have low internet bandwidth), you can use the AWS CloudFormation template provided here.
For example, this is how I start it using the aws CLI:
$ cp /path/to/kafka-docker-playground/cloudformation/kafka-docker-playground.json tmp.json
$ aws cloudformation create-stack --stack-name kafka-docker-playground-$USER --template-body file://tmp.json --region eu-west-3 \
    --parameters ParameterKey=KeyName,ParameterValue=$KEY_NAME ParameterKey=InstanceName,ParameterValue=kafka-docker-playground-$USER