
Merge pull request confluentinc#219 from gianlucanatali/ibm-demo
IBM Demo
  • Loading branch information
rmoff authored Jul 26, 2021
2 parents 93b0676 + 4abf1ef commit 558a4c9
Showing 28 changed files with 336 additions and 140 deletions.
3 changes: 2 additions & 1 deletion README.md
@@ -55,6 +55,7 @@ You may well need to allocate Docker 8GB when running these. Avoid allocating al
- [Multi-node ksqlDB and Kafka Connect clusters](multi-cluster-connect-and-ksql)
- [Streaming ETL pipeline from MongoDB to Snowflake with Apache Kafka®](streaming-etl-mongodb-snowflake)
- [Bridge to Cloud (and back!) with Confluent and MongoDB Atlas](mongodb-demo)
- [Confluent + IBM Demo](ibm-demo) Read data from IBM MQ and IBM DB2, join with ksqlDB, sink to IBM MQ

### Kafka Connect

@@ -78,7 +79,7 @@ You may well need to allocate Docker 8GB when running these. Avoid allocating al
- [MQTT Connect Connector Demo](mqtt-connect-connector-demo)
- [Example Kafka Connect syslog configuration and Docker Compose](syslog) (see blog series [1](https://www.confluent.io/blog/real-time-syslog-processing-apache-kafka-ksql-part-1-filtering/?utm_campaign=rmoff&utm_source=demo-scene)/[2](https://www.confluent.io/blog/real-time-syslog-processing-with-apache-kafka-and-ksql-part-2-event-driven-alerting-with-slack/?utm_campaign=rmoff&utm_source=demo-scene)/[3](https://www.confluent.io/blog/real-time-syslog-processing-apache-kafka-ksql-enriching-events-with-external-data/?utm_campaign=rmoff&utm_source=demo-scene) and standalone articles [here](https://rmoff.net/2019/12/20/analysing-network-behaviour-with-ksqldb-and-mongodb/?utm_campaign=rmoff&utm_source=demo-scene) and [here](https://rmoff.net/2019/12/18/detecting-and-analysing-ssh-attacks-with-ksqldb/?utm_campaign=rmoff&utm_source=demo-scene))
- [Azure SQL Data Warehouse Connector Sink Demo](azure-sqldw-sink-connector)
- [IBM MQ Connect Connector Demo](cp-all-in-one-ibmmq)
- [Confluent + IBM Demo](ibm-demo) Read data from IBM MQ and IBM DB2, join with ksqlDB, sink to IBM MQ
- [Solace Sink/Source Demo](solace)

### Confluent Cloud
1 change: 0 additions & 1 deletion cp-all-in-one-ibmmq/.gitignore

This file was deleted.

92 changes: 0 additions & 92 deletions cp-all-in-one-ibmmq/README.md

This file was deleted.

Binary file removed cp-all-in-one-ibmmq/images/addmessage.png
Binary file removed cp-all-in-one-ibmmq/images/addmessage2.png
Binary file removed cp-all-in-one-ibmmq/images/clickstream-schema.png
Binary file removed cp-all-in-one-ibmmq/images/ibmmq-queues.png
Binary file removed cp-all-in-one-ibmmq/images/ibmmq-schema.png
Binary file removed cp-all-in-one-ibmmq/images/join.png
2 changes: 2 additions & 0 deletions ibm-demo/.gitignore
@@ -0,0 +1,2 @@
mqlibs/*
connect-jars/*
37 changes: 30 additions & 7 deletions cp-all-in-one-ibmmq/Makefile → ibm-demo/Makefile
@@ -1,38 +1,61 @@
export CONFLUENT_DOCKER_TAG = 6.1.1

build:
docker-compose build

cluster: ibmjars
cluster: connectjars
docker-compose up -d

ps:
docker-compose ps

ibmjars:
docker-compose up -d ibmmq
-docker exec -it -e LICENSE=accept -e MQ_QMGR_NAME=MQ1 ibmmq cp -Ra /opt/mqm/java/lib/ /project/mqlibs
#-docker exec -it -e LICENSE=accept -e MQ_QMGR_NAME=MQ1 ibmmq cp -Ra /opt/mqm/java/lib/ /project/mqlibs
docker cp ibmmq:/opt/mqm/java/lib/ mqlibs
docker-compose up -d ibmdb2
docker cp ibmdb2:/opt/ibm/db2/V11.5/java/db2jcc4.jar connect-jars/db2jcc4.jar


topic:
docker exec -it connect kafka-topics --bootstrap-server broker:29092 --create --topic ibmmq --partitions 1 --replication-factor 1
docker exec -it connect kafka-topics --bootstrap-server broker:29092 --create --topic clickstream --partitions 1 --replication-factor 1

connectjars: ibmjars
cp mqlibs/lib/jms.jar connect-jars/jms.jar
cp mqlibs/lib/com.ibm.mq.allclient.jar connect-jars/com.ibm.mq.allclient.jar

connectsource: connectmqsource connectdatagen

connect:
docker exec -it connect curl -d "@/ibmmq/ibmmq-connect.json" \

connectmqsource:
docker exec -it connect curl -d "@/ibmmq/ibmmq-source.json" \
-X PUT \
-H "Content-Type: application/json" \
http://connect:8083/connectors/ibmmq-source/config

connectdb2source:
docker exec -it connect curl -d "@/ibmmq/ibmdb2-source.json" \
-X PUT \
-H "Content-Type: application/json" \
http://connect:8083/connectors/ibmdb2-source/config

connectdatagen:
docker exec -it connect curl -d "@/clickstream/clickstream-connector.json" \
-X PUT \
-H "Content-Type: application/json" \
http://connect:8083/connectors/clickstream/config


docker exec -it connect confluent-hub install --no-prompt confluentinc/kafka-connect-tibco-source:1.0.0-preview
docker exec -it connect confluent-hub install --no-prompt confluentinc/kafka-connect-tibco-sink:1.1.1-preview
connectsink:
docker exec -it connect curl -d "@/ibmmq/ibmmq-sink.json" \
-X PUT \
-H "Content-Type: application/json" \
http://connect:8083/connectors/ibmmq-sink/config

down:
docker-compose down
-rm -rf mqlibs/*
-rm -rf connect-jars/*

consumer:
docker exec -it connect kafka-avro-console-consumer --bootstrap-server broker:29092 --topic ibmmq --from-beginning \
190 changes: 190 additions & 0 deletions ibm-demo/README.md
@@ -0,0 +1,190 @@
![image](images/architecture.png)

# Confluent + IBM Demo

This repository demonstrates how to integrate Confluent Platform with IBM technologies (IBM MQ and IBM DB2). Two source connectors will be started: a Datagen connector to mock clickstream data, and an IBM MQ source connector. We'll then use ksqlDB to join the two streams. We'll also configure an IBM DB2 source connector to read data from DB2. Finally, the result of the ksqlDB join will be sent back to IBM MQ using a sink connector.

## Download the demo
You can download the demo [here](https://bit.ly/3ex1tLx).

Unzip `ibm-demo.zip` and `cd` into the directory from your terminal.

## Make commands

This step will spin up the Confluent Platform cluster and the IBM DB2 and IBM MQ servers.

```bash
make build
make cluster
# wait a minute for the cluster to spin up
```
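Before moving on, you can check that all containers are up; a quick sanity check using the `ps` target from the Makefile shown above:

```bash
# List the containers started by docker-compose (wraps `docker-compose ps`)
make ps
```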

## Make the topics

With this command we create the topics we need:

```bash
make topic
```
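To confirm that the `ibmmq` and `clickstream` topics exist, you can list the topics from the `connect` container; a sketch based on the same `kafka-topics` invocation the Makefile uses:

```bash
# List topics on the broker; `ibmmq` and `clickstream` should be present
docker exec -it connect kafka-topics --bootstrap-server broker:29092 --list
```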

## Open the IBM MQ Dashboard

[Log in](https://localhost:9443/ibmmq/console/login.html) with the following credentials:

```conf
UserName=admin
Password=passw0rd
```

## Show AVRO schema in C3 topics

You need to send a message to IBM MQ before the schema will appear in the topic in C3.

- Select `DEV.QUEUE.1` under "Queues on MQ1"

![ibmmq](images/ibmmq-queues.png)

- Add a message

![add image](images/addmessage.png)
![add image](images/addmessage2.png)

Notice that the messages are not consumed yet...

## Access Confluent Control Center
Open [Confluent Control Center](http://localhost:9021). Here you can see your local Confluent cluster and the topics created earlier.

## Make the source connectors

Now we configure the source connectors so we can read data from IBM MQ and generate mock clickstream data:

```bash
make connectsource
# wait a minute before moving on to the next step
```
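If you want to verify the connectors before continuing, you can query the Kafka Connect REST API from inside the `connect` container; a hedged check using the connector names from the Makefile targets above:

```bash
# Check the status of the IBM MQ source and the Datagen clickstream connectors
docker exec -it connect curl -s http://connect:8083/connectors/ibmmq-source/status
docker exec -it connect curl -s http://connect:8083/connectors/clickstream/status
```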

- You can now see the schema assigned to the `ibmmq` topic

![ibmmq topic](images/ibmmq-schema.png)

## AVRO messages appear in the consumer

Run the `ibmmq` consumer to see messages coming in from `DEV.QUEUE.1` (or check in C3):

```bash
make consumer
```
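If you prefer to run the consumer by hand, the Makefile target wraps a command along these lines; a sketch, where the Schema Registry URL is an assumption about the docker-compose service name:

```bash
# Consume Avro-encoded records from the ibmmq topic
# NOTE: the schema.registry.url value is an assumption about this compose setup
docker exec -it connect kafka-avro-console-consumer \
  --bootstrap-server broker:29092 \
  --topic ibmmq --from-beginning \
  --property schema.registry.url=http://schema-registry:8081
```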

You can also see in IBM MQ that the messages are not there anymore.


## KSQL

### Create the stream from the CLICKSTREAM topic with ksqlDB

In [Confluent Control Center](http://localhost:9021), select the cluster tile, click on ksqlDB in the left menu, and select the `ksqldb1` cluster.

Using the editor, run the queries below:

```sql
CREATE STREAM CLICKSTREAM
WITH (KAFKA_TOPIC='clickstream',
VALUE_FORMAT='AVRO');
```
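To check that the stream is receiving events, you can run a quick query from the same editor; a minimal sketch, with columns depending on the Datagen clickstream schema:

```sql
-- Inspect clickstream events as they arrive
SELECT * FROM CLICKSTREAM
EMIT CHANGES LIMIT 5;
```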

## Add another message to DEV.QUEUE.1

Send another message to IBM MQ. Use the usernames `bobk_43` or `akatz1022` so that the ksqlDB join below captures clickstreams for those users.

## Create the Stream for the IBMMQ topic


```sql
CREATE STREAM ibmmq
WITH (KAFKA_TOPIC='ibmmq',
VALUE_FORMAT='AVRO');
```
Click on `Add query properties`, set `auto.offset.reset` to `Earliest`, and run:

```sql
SELECT * FROM ibmmq
EMIT CHANGES;
```

```sql
SELECT "TEXT" FROM ibmmq
EMIT CHANGES;
```

## JOIN the 2 streams

Paste the following statement into the ksqlDB editor to perform the join:

```sql
CREATE STREAM VIP_USERS AS
  SELECT *
  FROM CLICKSTREAM
  JOIN IBMMQ WITHIN 5 SECONDS
    ON text = username
  EMIT CHANGES;
```

![join](images/join.png)

```sql
SELECT * FROM VIP_USERS
EMIT CHANGES;
```

This query will return values only if the messages you added to IBM MQ match usernames in the CLICKSTREAM stream/topic (as instructed above).
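You can also inspect the structure of the joined stream; a minimal sketch:

```sql
-- Show the columns and serialization format of the VIP_USERS stream
DESCRIBE VIP_USERS;
```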

## Configure DB2

```bash
docker exec -ti ibmdb2 bash -c "su - db2inst1"
```

```bash
db2 connect to sample user db2inst1 using passw0rd
```

```bash
db2 LIST TABLES
```
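If you want to peek at the sample data itself, you can query one of the tables from the same session; a hedged example, assuming the `EMPLOYEE` table from the DB2 SAMPLE database:

```bash
# Query a few rows from the EMPLOYEE table in the SAMPLE database
db2 "SELECT * FROM EMPLOYEE FETCH FIRST 5 ROWS ONLY"
```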

You can now exit DB2:

```bash
exit
```

Now you can create the connector to load the data from DB2:

```bash
make connectdb2source
```
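As before, you can confirm the connector is running via the Connect REST API; a hedged check using the connector name from the Makefile target above:

```bash
# List all connectors registered so far, then check the DB2 source connector status
docker exec -it connect curl -s http://connect:8083/connectors
docker exec -it connect curl -s http://connect:8083/connectors/ibmdb2-source/status
```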

You will see that the connector automatically creates topics and data in Confluent. Check [Confluent Control Center](http://localhost:9021) under Topics.

You can also see the connectors created by clicking on the Connect link in the left menu.

## Sink data to IBM MQ

Let's sink the joined stream data into the IBM MQ queue `DEV.QUEUE.2`:

```bash
make connectsink
```
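Before checking the queue, you can verify the sink connector started cleanly; another hedged check via the Connect REST API:

```bash
# Check the status of the IBM MQ sink connector
docker exec -it connect curl -s http://connect:8083/connectors/ibmmq-sink/status
```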

You can see the data by [logging in](https://localhost:9443/ibmmq/console/login.html) with the following credentials:

```conf
UserName=admin
Password=passw0rd
```

## Bring down the demo
When you are done with the demo, execute the command:

```bash
make down
```
File renamed without changes.
