Ideal production setup for 1200 events per second of ingestion and a total data set of 3 billion


I want to use Druid in my production environment with a peak ingestion rate of 5000 events per second and a total data set of 3 billion. I am currently ingesting data with Storm and Tranquility on a basic production setup: a single machine with 16 GB of memory and 8 cores.
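For reference, this is roughly how my Tranquility tuning looks (shown here in the server-config JSON shape; the data source name, schema details, and counts are illustrative, and the exact outer layout may differ by Tranquility version). My understanding is that `task.partitions` controls ingestion parallelism and `task.replicants` controls task replication, which is what I am relying on to avoid data loss:

```json
{
  "dataSources": [
    {
      "spec": {
        "dataSchema": {
          "dataSource": "my_events",
          "granularitySpec": {
            "type": "uniform",
            "segmentGranularity": "hour",
            "queryGranularity": "none"
          }
        }
      },
      "properties": {
        "task.partitions": "2",
        "task.replicants": "2"
      }
    }
  ],
  "properties": {
    "zookeeper.connect": "localhost:2181"
  }
}
```

With Storm, I believe these same `task.partitions` / `task.replicants` knobs are set on the beam tuning rather than in a server config file, but the sizing question is the same.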

It would be great to get an idea of the ideal production sizing for Druid, specifically so that messages are not dropped and no data is lost.


Anjna Bhati