big datasource in small memory

Hello,
As we know, Druid can only answer queries when the actual segments are loaded.

Consider the following situation: we do not have enough memory to load the segments. What does Druid do in this situation?

Hi Tom,

The historical nodes load segments using both memory and disk. The maximum total size of the segments that one historical can load is set by

druid.server.maxSize

Regards,

Andres
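
For reference, a minimal sketch of how this typically looks in a historical's runtime.properties; the path and size values below are illustrative, not from this thread:

    # Total size in bytes of segments this historical will announce it can serve
    druid.server.maxSize=130000000000

    # Local on-disk segment cache; the maxSize values here should add up to at most druid.server.maxSize
    druid.segmentCache.locations=[{"path":"/var/druid/segment-cache","maxSize":130000000000}]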

Tom, you’ve asked this question a few times now. I really suggest reading more about memory mapping.
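
To see what memory mapping buys you, here is a generic Java sketch (not Druid's actual code, and the file path is hypothetical): a mapped file is read through normal memory accesses, the OS pages in only the parts that are touched, and it can evict those pages again under pressure. That is why a historical can serve more segment data than fits in physical RAM, at the cost of slower queries when pages have to come from disk.

    import java.io.IOException;
    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;
    import java.nio.file.Path;
    import java.nio.file.StandardOpenOption;

    public class MmapSketch {
        public static void main(String[] args) throws IOException {
            // Hypothetical file standing in for a segment; any large local file works.
            Path segmentFile = Path.of("/tmp/segment.bin");

            try (FileChannel channel = FileChannel.open(segmentFile, StandardOpenOption.READ)) {
                // Map the whole file read-only. Nothing is read from disk yet;
                // the OS pages data in lazily on first access.
                MappedByteBuffer buf = channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());

                // The first access to a page causes a page fault and a disk read;
                // later accesses hit the OS page cache. Under memory pressure the
                // kernel simply drops clean pages and re-reads them later, so the
                // whole file never has to fit in RAM at once.
                byte first = buf.get(0);
                System.out.println("first byte = " + first);
            }
        }
    }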