DEVX-1783: Generalize client programming languages examples (confluen…
ybyzek authored May 7, 2020
1 parent 8766fb5 commit 7eca640
Showing 20 changed files with 80 additions and 82 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -31,7 +31,7 @@ The best demo to start with is [cp-demo](https://github.com/confluentinc/cp-demo
| Demo | Local | Docker | Description
| ------------------------------------------ | ----- | ------ | ---------------------------------------------------------------------------
| [Confluent Cloud CLI](ccloud/beginner-cloud/README.md#confluent-cloud-cli) | Y | N | Fully automated demo interacting with your Confluent Cloud cluster using Confluent Cloud CLI <br><img src="clients/cloud/images/confluent-cli.png" width="300">
- | [Clients to Cloud](clients/cloud/README.md) | [Y](clients/cloud/README.md) | N | Client applications in different programming languages connecting to [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.top) <br><img src="clients/cloud/images/clients-all.png" width="450">
+ | [Clients in Various Languages to Cloud](clients/cloud/README.md) | [Y](clients/cloud/README.md) | N | Client applications, showcasing producers and consumers, in various programming languages connecting to [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.top) <br><img src="clients/cloud/images/clients-all.png" width="450">
| [Cloud ETL](cloud-etl/README.md) | [Y](cloud-etl/README.md) | N | Cloud ETL solution using fully-managed Confluent Cloud connectors and fully-managed ksqlDB <br><img src="cloud-etl/docs/images/topology.png" width="450">
| [Fully Managed Stack](ccloud/ccloud-stack/README.md) | Y | N | Creates a fully-managed stack in Confluent Cloud, including a new environment, service account, Kafka cluster, KSQL app, Schema Registry, and ACLs. The demo also generates a config file for use with client applications.<br><img src="clients/cloud/images/confluent-cloud.png" width="300">
| [On-Prem Kafka to Cloud](ccloud/README.md) | [Y](ccloud/README.md) | [Y](ccloud/README.md) | This more advanced demo showcases an on-prem Kafka cluster and [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.top) cluster, and data copied between them with Confluent Replicator <br><img src="ccloud/docs/images/services-in-cloud.jpg" width="450">
@@ -52,7 +52,7 @@ The best demo to start with is [cp-demo](https://github.com/confluentinc/cp-demo

| Demo | Local | Docker | Description
| ------------------------------------------ | ----- | ------ | ---------------------------------------------------------------------------
- | [Clients](clients/cloud/README.md) | [Y](clients/cloud/README.md) | N | Client applications in different programming languages <br><img src="clients/cloud/images/clients-all.png" width="450">
+ | [Clients in Various Languages](clients/cloud/README.md) | [Y](clients/cloud/README.md) | N | Client applications, showcasing producers and consumers, in various programming languages <br><img src="clients/cloud/images/clients-all.png" width="450">
| [Connect and Kafka Streams](connect-streams-pipeline/README.md) | [Y](connect-streams-pipeline/README.md) | N | Demonstrate various ways, with and without Kafka Connect, to get data into Kafka topics and then loaded for use by the Kafka Streams API <br><img src="connect-streams-pipeline/images/blog_connect_streams_diag.jpg" width="450">


2 changes: 1 addition & 1 deletion clients/README.md
@@ -2,7 +2,7 @@

# Basic Producers and Consumers

- * [Various programming languages](cloud/README.md)
+ * [Clients in various programming languages](cloud/README.md): run client examples against a Kafka cluster on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster.

<a href="cloud/README.md" target="_blank"><img src="cloud/images/clients-all.png" width="600"></a>

22 changes: 10 additions & 12 deletions clients/cloud/README.md
@@ -2,9 +2,15 @@

## Programming Languages

- This directory includes examples of Kafka client applications written in different languages.
+ This directory includes examples of Kafka client applications, showcasing producers and consumers, written in various programming languages.
The README for each language walks through the necessary steps to run each example.
- Each client example takes as an argument a properties file with the configuration parameters that specify connection information for any Kafka cluster: this can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster.
+ Each client example takes as an argument a properties file with the configuration parameters that specify connection information for any of the following:
+
+ * Kafka cluster running on your local host (Download [Confluent Platform](https://www.confluent.io/download/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud))
+ * [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud)
+ * Any other remote Kafka cluster
+
+ Click on any language in the table below:

| | | |
|:---------------------------------:|:-----------------------------------------------:|:---------------------------------:|
@@ -15,19 +21,11 @@ Each client example takes as an argument a properties file with the configuratio
| [![](images/kafka-connect-datagen.png)](kafka-connect-datagen/) | [![](images/ksql-datagen.png)](ksql-datagen/) | [![](images/rust.png)](rust/) |
| [![](images/kafka.png)](kafka-commands/) | [![](images/clojure.png)](clojure/) | [![](images/springboot.png)](java-springboot/) |

- ## Confluent Cloud and Confluent Cloud Schema Registry
+ ## With Schema Registry

- The following subset includes examples with Confluent Schema Registry and Avro data:
+ The following subset includes examples with Schema Registry and Avro data:

| | | |
|:---------------------------------:|:-----------------------------------------------:|:---------------------------------:|
| [![](images/java.png)](java/) | [![](images/python.png)](python/) | [![](images/confluent-cli.png)](confluent-cli/) |
| [![](images/kafka-connect-datagen.png)](kafka-connect-datagen/) | [![](images/ksql-datagen.png)](ksql-datagen/) | [![](images/kafka.png)](kafka-commands/) |

- ## Other Confluent Cloud Demos
-
- You may also run an [end-to-end Confluent Cloud demo](https://docs.confluent.io/current/tutorials/examples/ccloud/docs/index.html?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) that showcases hybrid Kafka clusters: moving from self-managed to Confluent Cloud, with other streaming processing applications (e.g. ksqlDB) and components of the Confluent Platform (e.g. Confluent Replicator, Confluent Control Center, and Confluent Schema Registry).
-
- ## Confluent Cloud
-
- [![](images/confluent-cloud.png)](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud)
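For reference, the properties file that each client example takes as an argument is a plain key=value file. A minimal sketch, assuming librdkafka-style settings and placeholder endpoint and credentials (the Java-based examples use a slightly different dialect, e.g. `sasl.jaas.config` in place of `sasl.username` and `sasl.password`):

```
# e.g. $HOME/.confluent/librdkafka.config -- all values here are placeholders
bootstrap.servers=pkc-12345.us-west-2.aws.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanisms=PLAIN
sasl.username=<API_KEY>
sasl.password=<API_SECRET>

# For a Kafka cluster on your local host, bootstrap.servers=localhost:9092
# is usually all that is required.
```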
8 changes: 4 additions & 4 deletions clients/cloud/c/README.md
@@ -20,11 +20,11 @@ cc producer.c common.c json.c -o producer -lrdkafka -lm

# Example 1: Hello World!

- In this example, the producer writes JSON data to a topic in Confluent Cloud.
+ In this example, the producer writes JSON data to a topic in your Kafka cluster.
Each record has a key representing a username (e.g. `alice`) and a value of a count, formatted as json (e.g. `{"count": 0}`).
- The consumer reads the same topic from Confluent Cloud and keeps a rolling sum of the counts as it processes each record.
+ The consumer reads the same topic and keeps a rolling sum of the counts as it processes each record.

- 1. Run the producer, passing in arguments for (a) the topic name, and (b) the local file with configuration parameters to connect to your Confluent Cloud instance:
+ 1. Run the producer, passing in arguments for (a) the topic name, and (b) the local file with configuration parameters to connect to your Kafka cluster:

```bash
$ ./producer test1 $HOME/.confluent/librdkafka.config
...
Message delivered to test1 [0] at offset 9 in 22.81ms: { "count": 10 }
```
- 2. Run the consumer, passing in arguments for (a) the same topic name as used above, (b) the local file with configuration parameters to connect to your Confluent Cloud instance. Verify that the consumer received all the messages, then press Ctrl-C to exit.
+ 2. Run the consumer, passing in arguments for (a) the same topic name as used above, (b) the local file with configuration parameters to connect to your Kafka cluster. Verify that the consumer received all the messages, then press Ctrl-C to exit.
```bash
$ ./consumer test1 $HOME/.confluent/librdkafka.config
...
```
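The produce half of this Hello World pattern is the same in every language that follows. As an illustrative sketch only, not this repository's C source (the class name is hypothetical), here is the loop in Java client terms:

```java
import java.io.FileInputStream;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ProducerSketch {
  public static void main(String[] args) throws Exception {
    // args[0] = topic, args[1] = properties file with connection parameters
    Properties props = new Properties();
    props.load(new FileInputStream(args[1]));
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

    try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
      for (int i = 0; i < 10; i++) {
        ProducerRecord<String, String> record =
            new ProducerRecord<>(args[0], "alice", "{\"count\": " + i + "}");
        // The callback is the delivery report behind the
        // "Message delivered ... at offset ..." lines shown above.
        producer.send(record, (metadata, exception) -> {
          if (exception != null) {
            exception.printStackTrace();
          } else {
            System.out.printf("Message delivered to %s [%d] at offset %d%n",
                metadata.topic(), metadata.partition(), metadata.offset());
          }
        });
      }
      producer.flush();
    }
  }
}
```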
8 changes: 4 additions & 4 deletions clients/cloud/clojure/README.md
@@ -13,11 +13,11 @@ For more information, please see the [application development documentation](htt

# Example 1: Hello World!

- In this example, the producer writes Kafka data to a topic in Confluent Cloud.
+ In this example, the producer writes Kafka data to a topic in your Kafka cluster.
Each record has a key representing a username (e.g. `alice`) and a value of a count, formatted as json (e.g. `{"count": 0}`).
- The consumer reads the same topic from Confluent Cloud and keeps a rolling sum of the counts as it processes each record.
+ The consumer reads the same topic and keeps a rolling sum of the counts as it processes each record.

- 1. Run the producer, passing in arguments for (a) the local file with configuration parameters to connect to your Confluent Cloud instance and (b) the topic name:
+ 1. Run the producer, passing in arguments for (a) the local file with configuration parameters to connect to your Kafka cluster and (b) the topic name:

```shell
$ lein producer $HOME/.confluent/java.config test1
...
Produced record to topic test1 partition [0] @ offset 9
10 messages were produced to topic test1!
```

- 2. Run the consumer, passing in arguments for (a) the local file with configuration parameters to connect to your Confluent Cloud instance and (b) the same topic name as used above. Verify that the consumer received all the messages:
+ 2. Run the consumer, passing in arguments for (a) the local file with configuration parameters to connect to your Kafka cluster and (b) the same topic name as used above. Verify that the consumer received all the messages:

```shell
$ lein consumer $HOME/.confluent/java.config test1
...
```
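The consume half of the pattern, again as a hedged Java sketch rather than this repository's Clojure code (the group id is arbitrary, and the digit-stripping JSON parse is a deliberate shortcut):

```java
import java.io.FileInputStream;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ConsumerSketch {
  public static void main(String[] args) throws Exception {
    // args[0] = properties file with connection parameters, args[1] = topic
    Properties props = new Properties();
    props.load(new FileInputStream(args[0]));
    props.put("group.id", "hello-world-sketch");  // arbitrary consumer group
    props.put("auto.offset.reset", "earliest");   // read the topic from the beginning
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

    long total = 0;
    try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
      consumer.subscribe(Collections.singletonList(args[1]));
      while (true) {
        ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
        for (ConsumerRecord<String, String> record : records) {
          // Values look like {"count": N}: strip to the digits and
          // add N to the rolling sum.
          total += Long.parseLong(record.value().replaceAll("[^0-9]", ""));
          System.out.printf("Consumed record with key %s and value %s, total count: %d%n",
              record.key(), record.value(), total);
        }
      }
    }
  }
}
```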
4 changes: 2 additions & 2 deletions clients/cloud/confluent-cli/README.md
@@ -15,9 +15,9 @@ Produce messages to and consume messages from a Kafka cluster using [Confluent C

# Example 1: Hello World!

- In this example, the producer writes Kafka data to a topic in Confluent Cloud.
+ In this example, the producer writes Kafka data to a topic in your Kafka cluster.
Each record has a key representing a username (e.g. `alice`) and a value of a count, formatted as json (e.g. `{"count": 0}`).
- The consumer reads the same topic from Confluent Cloud.
+ The consumer reads the same topic.

1. Create the topic in Confluent Cloud

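The collapsed step above creates the topic. With the v1 `ccloud` CLI that was current at the time of this commit, that looked roughly like the following; treat the exact syntax as an assumption, since this CLI has since been superseded by the `confluent` CLI:

```bash
# Assumes you are already logged in and targeting the intended cluster.
ccloud kafka topic create test1
```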
8 changes: 4 additions & 4 deletions clients/cloud/csharp/README.md
@@ -11,13 +11,13 @@ Produce messages to and consume messages from a Kafka cluster using the .NET Pro

# Example

- In this example, the producer writes records to a topic in Confluent Cloud.
+ In this example, the producer writes records to a topic in your Kafka cluster.
Each record has a key representing a username (e.g. `alice`) and a value of a count, formatted as json (e.g. `{"count": 0}`).
- The consumer reads the same topic from Confluent Cloud and keeps a rolling sum of the counts as it processes each record.
+ The consumer reads the same topic and keeps a rolling sum of the counts as it processes each record.

## Produce Records

- Run the example application, passing in arguments for (a) whether to produce or consume (produce) (b) the topic name (c) the local file with configuration parameters to connect to your Confluent Cloud instance and (d, Windows only) a local file with default trusted root CA certificates.
+ Run the example application, passing in arguments for (a) whether to produce or consume (produce), (b) the topic name, (c) the local file with configuration parameters to connect to your Kafka cluster, and (d, Windows only) a local file with default trusted root CA certificates.

> Note: On Windows, default trusted root CA certificates - which are required for secure access to Confluent Cloud - are stored in the Windows Registry. The .NET library cannot currently access these certificates, so you will need to obtain them from somewhere else, for example the cacert.pem file distributed with curl: https://curl.haxx.se/ca/cacert.pem.
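One way to supply that file, assuming librdkafka's standard `ssl.ca.location` setting rather than anything specific to this example (the example itself passes the path as argument (d)), is a single extra line in the configuration file:

```
# Hypothetical Windows path: point this at the cacert.pem you downloaded
ssl.ca.location=C:\Users\you\Downloads\cacert.pem
```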
@@ -61,7 +61,7 @@ Produced record to topic test1 partition [0] @ offset 9

## Consume Records

- Run the consumer, passing in arguments for (a) whether to produce or consume (consume) (b) the same topic name as used above (c) the local file with configuration parameters to connect to your Confluent Cloud instance and (d, Windows only) a local file with default trusted root CA certificates. Verify that the consumer received all the messages:
+ Run the consumer, passing in arguments for (a) whether to produce or consume (consume), (b) the same topic name as used above, (c) the local file with configuration parameters to connect to your Kafka cluster, and (d, Windows only) a local file with default trusted root CA certificates. Verify that the consumer received all the messages:

```shell
# Run the consumer (Windows)
...
```
8 changes: 4 additions & 4 deletions clients/cloud/go/README.md
@@ -10,11 +10,11 @@ Produce messages to and consume messages from a Kafka cluster using [Confluent G

# Example 1: Hello World!

- In this example, the producer writes Kafka data to a topic in Confluent Cloud.
+ In this example, the producer writes Kafka data to a topic in your Kafka cluster.
Each record has a key representing a username (e.g. `alice`) and a value of a count, formatted as json (e.g. `{"Count": 0}`).
- The consumer reads the same topic from Confluent Cloud and keeps a rolling sum of the counts as it processes each record.
+ The consumer reads the same topic and keeps a rolling sum of the counts as it processes each record.

- 1. Run the producer, passing in arguments for (a) the local file with configuration parameters to connect to your Confluent Cloud instance and (b) the topic name:
+ 1. Run the producer, passing in arguments for (a) the local file with configuration parameters to connect to your Kafka cluster and (b) the topic name:

```bash
$ go build producer.go
...
Successfully produced record to topic test1 partition [0] @ offset 9
10 messages were produced to topic test1!
```

- 2. Run the consumer, passing in arguments for (a) the local file with configuration parameters to connect to your Confluent Cloud instance and (b) the same topic name as used above. Verify that the consumer received all the messages:
+ 2. Run the consumer, passing in arguments for (a) the local file with configuration parameters to connect to your Kafka cluster and (b) the same topic name as used above. Verify that the consumer received all the messages:

```bash
$ go build consumer.go
...
```
12 changes: 6 additions & 6 deletions clients/cloud/groovy/README.adoc
@@ -10,12 +10,12 @@ Produce messages to and consume messages from a Kafka cluster using the Groovy v

== Example 1: Hello World!

- In this example, the producer writes Kafka data to a topic in Confluent Cloud.
+ In this example, the producer writes Kafka data to a topic in your Kafka cluster.
Each record has a key representing a username (e.g. `alice`) and a value of a count, formatted as json (e.g. `{"count": 0}`).
- The consumer reads the same topic from Confluent Cloud and keeps a rolling sum of the counts as it processes each record.
- The Kafka Streams API reads the same topic from Confluent Cloud and does a stateful sum aggregation, also a rolling sum of the counts as it processes each record.
+ The consumer reads the same topic and keeps a rolling sum of the counts as it processes each record.
+ The Kafka Streams API reads the same topic and does a stateful sum aggregation, likewise keeping a rolling sum of the counts as it processes each record.

- . Run the producer, passing in arguments for (a) the local file with configuration parameters to connect to your Confluent Cloud instance and (b) the topic name:
+ . Run the producer, passing in arguments for (a) the local file with configuration parameters to connect to your Kafka cluster and (b) the topic name:

+
[source,shell]
----
...
----

You should see:

----
...
----

- . Run the consumer, passing in arguments for (a) the local file with configuration parameters to connect to your Confluent Cloud instance and (b) the same topic name as used above.
+ . Run the consumer, passing in arguments for (a) the local file with configuration parameters to connect to your Kafka cluster and (b) the same topic name as used above.
Verify that the consumer received all the messages:

+
[source,shell]
----
...
----

You should see:

----
...
----
When you are done, press `<ctrl>-c`.

- . Run the Kafka Streams application, , passing in arguments for (a) the local file with configuration parameters to connect to your Confluent Cloud instance and (b) the same topic name as used above.
+ . Run the Kafka Streams application, passing in arguments for (a) the local file with configuration parameters to connect to your Kafka cluster and (b) the same topic name as used above.
Verify that the consumer received all the messages:
+
....
...
....
6 changes: 3 additions & 3 deletions clients/cloud/java-springboot/README.adoc
@@ -43,7 +43,7 @@ $ curl -u <SR API KEY>:<SR API SECRET> https://<SR ENDPOINT>/subjects

This Spring Boot application has two components - Producer (`ProducerExample.java`) and Consumer (`ConsumerExample.java`).
Both components will be initialized during the Spring Boot application startup.
- The producer writes Kafka data to a topic in Confluent Cloud.
+ The producer writes Kafka data to a topic in your Kafka cluster.
Each record has a String key representing a username (e.g. `alice`) and a value of a count, formatted as Avro object

[source,json]
----
...
----

This command will build the jar and execute the `spring-kafka` powered producer and consumer.
- The consumer reads the same topic from Confluent Cloud and prints data to the console.
+ The consumer reads the same topic and prints data to the console.

You should see the following in the console:

----
...
----

NOTE: When you are done, press kbd:[Ctrl + c].

== Example 2: Kafka Streams with Spring Boot

- The Kafka Streams API reads the same topic from Confluent Cloud and does a stateful sum aggregation, also a rolling sum of the counts as it processes each record.
+ The Kafka Streams API reads the same topic and does a stateful sum aggregation, likewise keeping a rolling sum of the counts as it processes each record.

.Run the Kafka Streams application.
[source,shell]
----
...
----
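The stateful rolling-sum aggregation described here (and in the Groovy example above) can be sketched with the plain Kafka Streams API. This is an illustrative stand-in, not the Spring Boot application itself: it assumes JSON-formatted string values rather than this example's Avro records, and the application id and output topic are hypothetical.

[source,java]
----
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.Produced;

public class StreamsSumSketch {
  public static void main(String[] args) {
    Properties props = new Properties();
    props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streams-sum-sketch"); // hypothetical id
    props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // or your cluster

    StreamsBuilder builder = new StreamsBuilder();
    builder.stream("test1", Consumed.with(Serdes.String(), Serdes.String()))
        // Values look like {"count": N}: crude extraction, good enough for a sketch.
        .mapValues(v -> Long.parseLong(v.replaceAll("[^0-9]", "")))
        .groupByKey(Grouped.with(Serdes.String(), Serdes.Long()))
        // The stateful aggregation: a rolling sum of the counts per key.
        .reduce(Long::sum)
        .toStream()
        .to("test1-sums", Produced.with(Serdes.String(), Serdes.Long())); // hypothetical topic

    new KafkaStreams(builder.build(), props).start();
  }
}
----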
...
