Hello,

Nice work on this project - it looks really cool!

My shop is looking at using this as the base for a Golang stream processor. We have existing stream processors that implement our message format and conventions; one of those conventions is that we use Kafka headers for message metadata (correlation_id, timestamp, ser/des details, etc.).

I see that the provided KafkaConsumer passes the received message's body on to the next node, but does not pass its headers. Similarly, the provided KafkaProducer does not provide a way to assign/write headers (that I can see).

Do you have any guidance on the best way to accomplish this?

One option seems to be to copy KafkaConsumer and KafkaProducer and modify them to do what we want. That seems inelegant: we would be accepting a maintenance burden for a ton of functionality we would not actually be touching.

Or maybe (hopefully?) I've missed something and there is some header functionality already available?
Thanks,
Dan
Unfortunately the interface is just []byte today, so you're correct: headers are not passed. In better news, I have a plan that involves changing some of this as part of upcoming work to adopt generics and eliminate the need for type assertions.

That change set will be a major version / API-incompatible release, and it is a good opportunity to fix the Source interface to use a struct that separates metadata (like headers) from the message payload.
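To make the direction concrete, here is a rough sketch of the kind of struct that interface change could pass between nodes. All names here (Metadata, Message) are illustrative assumptions, not firebolt's current or committed API:

```go
package main

import "fmt"

// Metadata carries everything that is not the message body itself.
// Hypothetical shape; the real design may differ.
type Metadata struct {
	Headers   map[string][]byte // e.g. correlation_id, ser/des details
	Timestamp int64             // source timestamp, if available
}

// Message is what a Source might emit instead of a bare []byte,
// keeping metadata separate from the payload.
type Message struct {
	Meta    Metadata
	Payload []byte
}

func main() {
	msg := Message{
		Meta: Metadata{
			Headers: map[string][]byte{"correlation_id": []byte("abc-123")},
		},
		Payload: []byte(`{"hello":"world"}`),
	}
	// Downstream nodes can read headers without parsing the payload.
	fmt.Println(string(msg.Meta.Headers["correlation_id"]))
}
```

With a struct like this, nodes that only care about the payload keep reading Payload, while header-aware nodes get metadata for free.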
That may not help much if you're starting in the short term; unfortunately, I don't have a timeline I can commit to for that work.

The producer is easier to address, because fields can be added to firebolt.ProduceRequest without breaking API compatibility. Again, though, without the source changes that may not be enough to meet your needs.
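The backward-compatible producer change could look roughly like the sketch below. The struct shape and field names are illustrative assumptions (firebolt's actual ProduceRequest may be defined differently); the point is just that a new optional field defaults to its zero value, so existing callers compile and behave unchanged:

```go
package main

import "fmt"

// ProduceRequest sketch: Headers is a hypothetical new field.
// nil Headers means "produce with no headers", which is exactly
// what existing callers get today, so nothing breaks.
type ProduceRequest struct {
	Topic   string
	Message []byte
	Headers map[string][]byte // new, optional
}

func main() {
	// An existing caller: no Headers set, zero value is nil.
	old := ProduceRequest{Topic: "events", Message: []byte("payload")}

	// A header-aware caller opts in explicitly.
	new_ := ProduceRequest{
		Topic:   "events",
		Message: []byte("payload"),
		Headers: map[string][]byte{"correlation_id": []byte("abc-123")},
	}
	fmt.Println(len(old.Headers), len(new_.Headers))
}
```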
As you said, that leaves you in a spot where, if you have to start in the short term, you have only inelegant options. Hopefully they'd be temporary, and we could get you back on the paved path in the future.
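One such temporary option, which avoids forking KafkaConsumer/KafkaProducer entirely: since nodes only pass []byte today, tunnel the headers through the payload in an envelope, encoded at the edge where you consume and decoded at the edge where you produce. The envelope shape below is entirely ours (a sketch, not anything firebolt provides), and JSON is used only for brevity:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Envelope tunnels Kafka-style headers through a plain []byte pipeline.
type Envelope struct {
	Headers map[string]string `json:"headers"`
	Body    []byte            `json:"body"`
}

// wrap is called where the message enters the pipeline.
func wrap(headers map[string]string, body []byte) ([]byte, error) {
	return json.Marshal(Envelope{Headers: headers, Body: body})
}

// unwrap is called where the message leaves the pipeline.
func unwrap(raw []byte) (Envelope, error) {
	var e Envelope
	err := json.Unmarshal(raw, &e)
	return e, err
}

func main() {
	raw, _ := wrap(map[string]string{"correlation_id": "abc-123"}, []byte("payload"))
	e, _ := unwrap(raw)
	fmt.Println(e.Headers["correlation_id"], string(e.Body))
}
```

The cost is that every intermediate node sees the envelope rather than the raw body, and there is per-message encode/decode overhead; the benefit is that the unmodified consumer and producer keep working, and the envelope can be deleted once a metadata-aware Source interface lands.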