Monitoring logs with Kafka and Druid

I am trying to integrate Kafka with Druid for real-time metrics and monitoring (the idea is to create a new datasource to monitor various metrics). I have set the configuration to emit logs to Kafka, but the problem is that it is emitting too much data (> 100 MB).
Below is my config for the HTTP emitter, which I have placed in the common config file.
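For reference, a typical Druid HTTP emitter setup in the common runtime properties looks roughly like the sketch below; the recipient URL here is a placeholder, not the asker's actual value.

```properties
# common.runtime.properties (sketch; the URL is a placeholder)
druid.emitter=http
druid.emitter.http.recipientBaseUrl=http://metrics-collector:8080/druid-metrics
```

Note that the HTTP emitter speaks plain HTTP. If recipientBaseUrl points directly at a Kafka broker port, the broker will try to parse the HTTP request as Kafka's binary wire protocol.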
Here is the error message I get: Invalid receive (size = 1347375956 larger than 104857600)

I am not sure why it is sending such a huge stream which Kafka is not able to handle.

I tried recreating the topic in Kafka and also deleting all logs in Druid, but I keep getting the above error.

While I have not tried this particular setup, it is entirely possible that Druid does not terminate lines in a way Kafka considers proper, hence the oversized request to Kafka.
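One way to sanity-check the number in that error: Kafka reads the first four bytes of every request as a big-endian length prefix, so the reported "size" is really just the first four bytes of whatever arrived on the wire. A quick decode:

```python
import struct

# Kafka interprets the first four bytes of a request as a
# big-endian int32 message length. Packing the reported size
# back into bytes shows what actually hit the socket.
size = 1347375956
first_bytes = struct.pack(">i", size)
print(first_bytes)  # b'POST'
```

Those four bytes spell "POST", which would be consistent with an HTTP request reaching Kafka's binary protocol port rather than with genuinely huge log batches.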

As an experiment, I would get a package called "filebeat" that can read log files and ship them to Kafka (alternatively, you could use rsyslog to send files to Kafka if you feel like reading a lot of documentation). Convert your Druid setup to write to a file, have Filebeat ship it to Kafka, and see if this error still occurs.
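A minimal Filebeat configuration for that experiment might look like the sketch below; the log path and topic name are placeholders you would adapt to your deployment.

```yaml
# filebeat.yml (sketch; paths and topic are placeholders)
filebeat.inputs:
  - type: log
    paths:
      - /var/log/druid/*.log   # wherever Druid is configured to write

output.kafka:
  hosts: ["localhost:9092"]    # Kafka bootstrap broker(s)
  topic: "druid-logs"          # placeholder topic name
  max_message_bytes: 1000000   # keep each message well under the broker limit
```

Because Filebeat ships each log line as a separate Kafka message using Kafka's own protocol, this also rules out any line-termination or protocol mismatch on the Druid side.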