What would be the best Druid tuning to get results in under a second when querying one year (by timestamp) out of 4 years of data (20GB)?
Performance is very relative. You’ll have to make sure that the cluster is properly sized for what you are doing with the type of queries you will be running and for how many concurrent queries you are targeting.
Here are the things you need to consider that affect performance:
- Segment size: should be within 500MB - 700MB
- Number of Historical cores in the cluster
- Available memory to map segments on the Historicals
- Number of processing threads
- Size of processing buffers in bytes
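On the Historical side, most of these knobs live in `runtime.properties`. A minimal sketch with illustrative values (assumptions for a small node, not recommendations for your workload; segment size itself is set at ingestion time via partitioning, not here):

```properties
# Processing threads (common rule of thumb: cores - 1)
druid.processing.numThreads=3
# Per-thread processing buffer, in bytes (~500MB here)
druid.processing.buffer.sizeBytes=500000000
# Total segment bytes this Historical will serve; leave RAM free
# beyond heap + direct memory so the OS can memory-map segments
druid.server.maxSize=300000000000
```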
20GB of data is pretty small, so a small cluster should be able to handle it with decent query performance.
These questions, as I'm sure you understand, require a bit of context about your setup. It would help others help you if you describe your setup and what you have already tried, so people can recommend things more quickly.
I have 3 Historical nodes, each with 4 cores, 32GB RAM, and a 1TB HDD.
What should my tuning parameters be on the Historical nodes, e.g. threads, buffers, heap size, direct memory, etc.?
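Whatever values you land on for threads and buffers, the direct memory setting has to be large enough for them. A small sizing sketch for one of these nodes (4 cores, 32GB RAM), using Druid's documented minimum of `sizeBytes * (numThreads + numMergeBuffers + 1)`; the thread count, merge-buffer count, and buffer size below are illustrative starting points, not benchmarked values:

```python
def required_direct_memory(buffer_size_bytes: int,
                           num_threads: int,
                           num_merge_buffers: int) -> int:
    """Minimum -XX:MaxDirectMemorySize for a Druid Historical process."""
    return buffer_size_bytes * (num_threads + num_merge_buffers + 1)

cores = 4
num_threads = cores - 1           # common rule of thumb: cores - 1
num_merge_buffers = 2             # assumption; Druid default is max(2, numThreads / 4)
buffer_size = 500 * 1024 * 1024   # 500 MiB per processing buffer

direct_mem = required_direct_memory(buffer_size, num_threads, num_merge_buffers)
print(f"MaxDirectMemorySize >= {direct_mem / 1024**3:.1f} GiB")  # ~2.9 GiB
```

With roughly 3GB of direct memory and a modest heap (say 4-8GB), the rest of the 32GB stays free for the OS page cache to memory-map segments, which is where most of the query speed comes from.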