Hi Guys

What would be the best tuning of Druid to get results back in under a second when querying one year of timestamps out of 4 years of data (20 GB)?

Umar,

Performance is very relative. You’ll have to make sure the cluster is properly sized for what you are doing: the type of queries you will be running and how many concurrent queries you are targeting.

Here are the things you need to consider that affect performance (a rough configuration sketch follows the list):

  1. Size of segments: each should be within 500 MB - 700 MB
  2. # of QPS
  3. # of Historical cores in the cluster
  4. Available memory on Historicals for memory-mapping segments
  5. # of processing threads
  6. Size of processing buffers in bytes
  7. # of merge buffers (numMergeBuffers)
  8. Direct memory size (MaxDirectMemorySize)
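
To make items 5 - 8 concrete, here is a rough sketch of the corresponding Historical settings. The values are placeholder assumptions for illustration, not recommendations for any particular cluster:

  # Historical runtime.properties (illustrative values only)
  druid.processing.numThreads=7                  # commonly set to cores - 1
  druid.processing.buffer.sizeBytes=536870912    # 512 MB per processing buffer
  druid.processing.numMergeBuffers=2             # merge buffers used by groupBy

  # JVM flag on the Historical process; direct memory must cover at least
  # (numThreads + numMergeBuffers + 1) * buffer.sizeBytes
  -XX:MaxDirectMemorySize=6g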

20GB of data is pretty small, so a small cluster should be able to handle it with decent query performance.

Rommel Garcia

Umar,

These questions, as I am sure you understand, require a bit of context about your setup. It would help others help you if you described your setup and the things you have already tried, so people can make useful recommendations more quickly.

I’ve 3 Historical nodes, each with 4 cores, 32 GB RAM, and a 1 TB HDD.

What should my tuning parameters be on the Historical nodes, e.g. processing threads, buffer sizes, heap size, direct memory, etc.?
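
For a rough sense of how those settings relate on a 4-core / 32 GB Historical, here is a sketch only; the exact numbers depend on your segment sizes and query mix:

  # Illustrative Historical tuning for a 4-core / 32 GB node (assumed values)
  druid.processing.numThreads=3                  # cores - 1
  druid.processing.buffer.sizeBytes=268435456    # 256 MB per processing buffer
  druid.processing.numMergeBuffers=2

  # Direct memory needed: (3 + 2 + 1) * 256 MB = 1.5 GB, so for example:
  # -XX:MaxDirectMemorySize=2g
  # Keep the heap relatively small; most of the 32 GB is best left to the
  # OS page cache for memory-mapped segments.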