Druid memory consumption

Hi,

It looks like Druid gradually consumes all available memory over time. For example, a vanilla installation starts out using a couple of GB of RAM, but after being fed data for a month Druid occupies almost the whole 64GB. The same happens with, say, 128GB of RAM after two months of ingesting data.

The question is: is this expected behaviour? Should we cap the maximum memory consumption (and if so, how)?

Anything that can shed some light on this would be much appreciated, thanks!

Regards,
Shinesun

So, nobody has any thoughts or suggestions on this?

It depends on what you’re looking at. If you are looking at the free memory on the system, it should go to 0.

If you’re looking at the allocated memory for the JVM, it should have an upper bound.
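A quick way to tell the two apart on Linux (the PID below is just a placeholder for whichever Druid process you are inspecting):

    free -h                              # "buff/cache" is reclaimable page cache; "available" is the real headroom
    ps -o rss,cmd -p <druid_process_pid> # RSS shows how much the JVM process itself actually holds

Memory sitting in the page cache is not "lost"; the kernel gives it back to processes whenever they need it.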

Our typical deployment specifies a JVM heap size and a direct memory limit, and then relies on the page cache on top of that, up to the limit of memory on the system.
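To make that concrete, here is a rough sketch for a single Historical on a 64GB box. Every number below is purely illustrative, not a recommendation, and should be sized to your own workload:

    jvm.config (illustrative):
      -Xms8g
      -Xmx8g
      -XX:MaxDirectMemorySize=13g

    runtime.properties (illustrative):
      druid.processing.numThreads=15
      druid.processing.numMergeBuffers=4
      druid.processing.buffer.sizeBytes=500000000

    # Direct memory required is roughly
    # (numThreads + numMergeBuffers + 1) * buffer.sizeBytes
    # = (15 + 4 + 1) * 500MB ≈ 10GB, so 13g leaves some headroom.
    # Heap (8g) + direct memory (13g) bounds the JVM at ~21GB; the
    # remaining ~40GB of RAM is deliberately left free so the OS page
    # cache can hold the memory-mapped segment files.

With -Xmx and -XX:MaxDirectMemorySize fixed, the JVM itself has an upper bound; any additional "used" memory you see at the OS level is page cache, which the kernel reclaims on demand.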

Hi,

Thanks for the follow-up.

That’s more or less the answer I’ve been looking for.

Regards,

Shinesun

Hi Shinesun

Can you provide more detail about how Druid is using all the RAM?

Are we talking about the JVM heap, or off-heap memory?

Is this happening on Historicals or Brokers?

Can you provide the configuration used to run the JVM and the Druid nodes in general?

IMO this could simply be a misconfiguration issue.