Streaming data from Kafka to Druid after transformations


We have streaming data coming in from Kafka, and I want to do some amount of transformation before pushing the data into Druid.

I see that Tranquility is integrated with Kafka for direct consumption; otherwise it seems to have integrations for a Storm/Spark sort of setup. I'm wondering if there is something similar for Kafka Connect?

Also, my transformation is pretty simple, so I'm wondering whether there is a simpler way forward here (maybe slightly off-topic for this group; I just thought I would check).

One of the things I was considering was using Kafka Connect with the HTTP interface of Tranquility. Is there anything to be careful about with this push-based approach?
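For reference, a minimal sketch of what pushing events over Tranquility Server's HTTP API looks like, assuming the default server port (8200) and the standard `/v1/post/<dataSource>` endpoint; the datasource name and event fields here are made-up placeholders:

```python
import json
import urllib.request
from datetime import datetime, timezone

# Hypothetical host and datasource name; adjust to your Tranquility Server config.
TRANQUILITY_URL = "http://localhost:8200/v1/post/my_datasource"

def make_event(user, value):
    """Build one event; Druid requires a timestamp column (here "timestamp")."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "value": value,
    }

def post_events(events, url=TRANQUILITY_URL):
    """POST a JSON array of events to Tranquility Server's HTTP endpoint."""
    body = json.dumps(events).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:  # raises on HTTP errors
        return json.loads(resp.read())

# Example (requires a running Tranquility Server):
#   post_events([make_event("alice", 42)])
```

One thing to watch with this push approach: Tranquility silently drops events whose timestamps fall outside its configured windowPeriod, so late-arriving data never reaches Druid.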

(Sorry if this seems a bit naive; I am a bit new to this stream-processing and analytics world.)



My suggestion would be to do the required pre-processing of the data from the source topic into another topic (let's say the target topic). You can use Spark, Flink, Storm, or any stream-processing framework for this. Then use the Kafka indexing service on the target topic to ingest into Druid.
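For a transformation this simple, even a plain consume-transform-produce loop can do the topic-to-topic step without a full processing framework. A minimal sketch using the third-party kafka-python package; the topic names, broker address, and field names ("ts", "user", "value") are all assumptions for illustration:

```python
import json

def transform(raw_bytes):
    """Example transformation: parse a JSON record, keep only the fields
    Druid needs, and rename "ts" to "timestamp" (field names are made up)."""
    record = json.loads(raw_bytes)
    return json.dumps({
        "timestamp": record["ts"],
        "user": record["user"],
        "value": float(record["value"]),
    }).encode("utf-8")

def run(source_topic="events_raw", target_topic="events_clean",
        servers="localhost:9092"):
    """Consume from the source topic, transform each message, and produce
    to the target topic. Requires: pip install kafka-python"""
    from kafka import KafkaConsumer, KafkaProducer
    consumer = KafkaConsumer(source_topic, bootstrap_servers=servers)
    producer = KafkaProducer(bootstrap_servers=servers)
    for msg in consumer:
        producer.send(target_topic, transform(msg.value))
```

The Kafka indexing service supervisor spec would then point at the target topic (here `events_clean`) and get exactly-once ingestion into Druid, which the Tranquility push path does not give you.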