Is druid.server.maxSize set based on disk or memory?

Is the max server size set based on disk size or memory?

Also, why do we have a segment cache max size?

Thanks,

Shilpa S

Is the max server size set based on disk size or memory?

Both:

  • disk: druid.server.maxSize must be large enough to hold the segments you want loaded from deep storage (e.g. S3) onto the Historical node, so it is bounded by local disk; spreading the data across multiple Historical nodes helps (see the config sketch after this list)

  • memory: as already explained, if the loaded segments fit into free memory, queries will run faster; otherwise the data is only memory-mapped and you can expect some disk I/O on reads
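
In config terms, here is a minimal sketch of the relevant part of a Historical node's runtime.properties; the path and sizes below are illustrative, not from this thread:

  # Local disk location(s) where segments pulled from deep storage
  # (e.g. S3) are cached; maxSize caps the bytes each location may hold.
  druid.segmentCache.locations=[{"path":"/var/druid/segment-cache","maxSize":300000000000}]

  # Total bytes of segments this Historical will announce and serve.
  # Keep it <= the sum of the segmentCache maxSize values, i.e. bounded by disk.
  druid.server.maxSize=300000000000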

Also, why do we have a segment cache max size?

Not sure what you want to achieve - are you looking for something like http://druid.io/docs/latest/querying/caching.html ?
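
If query-result caching (the page linked above) is what you're after, a minimal sketch for a Historical might look like the following; note this is separate from the on-disk segment cache, and the size value is illustrative:

  # Query-result caching on a Historical (illustrative value;
  # distinct from druid.segmentCache.locations, which is about disk)
  druid.historical.cache.useCache=true
  druid.historical.cache.populateCache=true
  druid.cache.type=local
  druid.cache.sizeInBytes=1000000000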