Event Driven Model for Image Processing using Spring Cloud Stream and Kafka. The project also includes Kubernetes-specific deployment files for the microservices.
The processor-service, worker-service and storage-service are Spring Boot microservices, with a Kafka broker in between. When concurrent binary data is sent to the processor-service, it scales to meet demand and publishes a job message to a Kafka topic. The worker-service scales based on the number of pending messages. The storage-service is designed so it can connect to any cloud storage backend.
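As a rough sketch, the Spring Cloud Stream bindings tying the worker-service to the two topics could look like the following `application.yaml` fragment. The function bean name `processJob` and the consumer group are assumptions for illustration, not taken from the project; only the topic names (data_stream, status_stream) come from the description above.

```yaml
spring:
  cloud:
    function:
      definition: processJob            # assumed function bean name in worker-service
    stream:
      kafka:
        binder:
          brokers: localhost:9092       # Kafka broker started by docker-compose
      bindings:
        processJob-in-0:
          destination: data_stream      # job messages produced by processor-service
          group: worker-service         # assumed consumer group
        processJob-out-0:
          destination: status_stream    # SUCCESS/FAILED results back to processor-service
```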
- User sends the binary data in the request body to the processor-service.
- Processor-service calls the storage-service to upload the binary data.
- Processor-service creates a job id in the Job entity table and sets the status to Running.
- Processor-service decodes the JWT token and creates a Job Message carrying the JWT token info and the storage URL, which it publishes to the producer topic (data_stream).
- Worker-service consumes the Job Message from the consumer topic (data_stream).
- Worker-service pulls the data from the storage-service, processes it, and uploads the result back to the storage-service.
- Worker-service publishes a Success or Failed message to the producer topic (status_stream).
- Processor-service consumes the message from the consumer topic (status_stream).
- Processor-service retrieves the record for that job id and updates the database entity with the respective status.
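The worker-service steps above map naturally onto Spring Cloud Stream's functional model, where the consumer is a plain `java.util.function.Function` bound to the input and output topics. A minimal sketch, with assumed message field names and the storage/image-processing calls stubbed out (the real project's classes may differ):

```java
import java.util.function.Function;

// Assumed message shapes; the actual project's fields may differ.
record JobMessage(String jobId, String jwt, String storageUrl) {}
record StatusMessage(String jobId, String status) {}

public class WorkerProcessor {
    // In the real service this would be registered as a @Bean so that
    // Spring Cloud Stream binds it to data_stream (in) and status_stream (out).
    public static Function<JobMessage, StatusMessage> processJob() {
        return job -> {
            try {
                // Placeholder for: download from storage-service, process the
                // image, upload the result back to storage-service.
                processImage(job.storageUrl());
                return new StatusMessage(job.jobId(), "SUCCESS");
            } catch (Exception e) {
                return new StatusMessage(job.jobId(), "FAILED");
            }
        };
    }

    private static void processImage(String storageUrl) {
        if (storageUrl == null || storageUrl.isBlank()) {
            throw new IllegalArgumentException("missing storage URL");
        }
        // image processing would happen here
    }
}
```

Because the function is a plain `Function`, it can be unit-tested by calling `apply` directly, without spinning up Kafka.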
- Docker installed. This should be enough to test the end-to-end application.
- For running in a Kubernetes cluster, install minikube.
- Git clone the repo: https://github.com/jokumar/image-processor.git
- Go inside the build-files folder and run the commands below:
- To build all the Spring Boot applications and create their images: ./build.sh
- To start the docker-compose file, which brings up the Kafka broker, MySQL DB and all three microservices: ./start-server.sh
- To stop all the services: ./stop-server
Testing the application:

The processor is a Spring Boot microservice which exposes an endpoint that takes the image binary and validates the incoming message. The API specification can be found in the ptc-file-processor project (openapi3-application.yaml); copy the YAML into a Swagger editor to view it. Validation is in place for the md5 checksum and the binary data, along with basic validation of the JWT token (format and encoding).
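The format/encoding check mentioned above can be done with the JDK alone: split the token on dots and base64url-decode the payload segment. A sketch (this checks structure only, not the signature):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class JwtCheck {
    /**
     * Decodes the payload (claims) segment of a JWT.
     * Throws IllegalArgumentException if the token is not three
     * dot-separated base64url segments.
     */
    public static String decodePayload(String jwt) {
        String[] parts = jwt.split("\\.");
        if (parts.length != 3) {
            throw new IllegalArgumentException("JWT must have 3 segments");
        }
        // JWT segments are base64url without padding; the URL decoder accepts that.
        byte[] payload = Base64.getUrlDecoder().decode(parts[1]);
        return new String(payload, StandardCharsets.UTF_8);
    }
}
```

Running this against the sample token below yields the claims JSON, including the tenant id (`tid`) and email used by the processor-service.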
1. Create a Job

URL: http://localhost:8080/ptc-file-processor/job/image
MethodType: POST
Header: Authorization eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyLCJ0aWQiOjEsIm9pZCI6MSwiYXVkIjoiY29tLmNvbXBhbnkuam9ic2VydmljZSIsImF6cCI6IjEiLCJlbWFpbCI6ImN1c3RvbWVyQG1haWwuY29tIn0.CcTapGbWX0UEMovUwC8kAcWMUxmbOeO0qhsu-wqHQH0

Request:
```
{ "encoding": "base64", "md5": "123456", "content": "" }
```

Response:
```
{ "jobId": "2", "payloadLocation": "http://ptc-storage:8081/api/v1/blob/file6544", "payloadSize": "254048" }
```

2. Get Job Details
URL: http://localhost:8080/ptc-file-processor/job/{jobId}
MethodType: GET
Response:
```
{ "id": "2", "tenantId": "1", "clientId": "1", "payloadLocation": "http://ptc-storage:8081/api/v1/blob/file6544", "payloadSize": "12345", "status": "SUCCESS" }
```
3. Get Job Status

URL: http://localhost:8080/ptc-file-processor/job/1/status
MethodType: GET
Response:
```
{ "status": "SUCCESS" }
```