We need to load **150 GB** of data and query it.

1) Is it possible to do this with just local storage and Derby? We are currently using local storage to load the data.
2) Indexing is very slow. What should we do to get better speed, and what kind of tuning configs should be set to speed up indexing?
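For reference, our tuningConfig currently looks roughly like this (a sketch with placeholder values, assuming native parallel batch ingestion; the numbers are not tuned yet):

```json
{
  "type": "index_parallel",
  "maxRowsInMemory": 150000,
  "maxNumConcurrentSubTasks": 4,
  "partitionsSpec": {
    "type": "dynamic",
    "maxRowsPerSegment": 5000000
  }
}
```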
I'm trying to set up a cluster with the following hardware: 5 nodes, each with 16 CPUs and 64 GB RAM.
- ZooKeeper, Coordinator, and Overlord on node 1.
- Historical and MiddleManager on node 2.
- Historical on node 3.
- Brokers on nodes 4 and 5.
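As a starting point, we are planning to size the Historical nodes roughly like this (a sketch only; the values are our assumptions for a 16-CPU / 64 GB node and have not been validated):

```properties
# Historical runtime.properties (assumed values, not yet tuned)
druid.processing.numThreads=15
druid.processing.buffer.sizeBytes=500000000
druid.processing.numMergeBuffers=4
druid.server.maxSize=300000000000
druid.segmentCache.locations=[{"path":"/var/druid/segment-cache","maxSize":300000000000}]
```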
3) Is this the right setup for this size of data? Please suggest any changes if required.
4) Our data doesn't actually have a timestamp, but we have generated one to test Druid's performance. The generated timestamps are random and not in any particular order. Will this affect performance?
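For context, the test timestamps are generated along these lines (a hypothetical sketch, not our exact code; `days_back` and the window are assumptions):

```python
import random
import time

def generate_test_timestamps(n, days_back=30):
    """Generate n random epoch-millisecond timestamps spread over the
    last `days_back` days, in no particular order."""
    now_ms = int(time.time() * 1000)
    window_ms = days_back * 24 * 60 * 60 * 1000
    # Each row gets an independent random offset, so the resulting
    # timestamps are unsorted and scattered across the whole window.
    return [now_ms - random.randrange(window_ms) for _ in range(n)]

timestamps = generate_test_timestamps(5)
```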