Segment [xxxx] too large for storage [/tmp/druid/indexCache:7,404,837]

In the logs of my historical node I found an exception saying a segment is too large to be loaded.




Result of the df command:

What option should I update so the segment can be loaded?

Looking at the configuration here:

and the logs here:

It appears the node starts up correctly and downloads segments until it runs out of space. Your segment cache is configured to hold at most 10 GB of segments, but your druid.server.maxSize is set to 100 GB.

If you change

druid.segmentCache.locations=[{"path": "/home/druid/cache/historical1/indexCache", "maxSize": 10000000000}]

to

druid.segmentCache.locations=[{"path": "/home/druid/cache/historical1/indexCache", "maxSize": 100000000000}]

the cache capacity will match your druid.server.maxSize.
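For reference, both settings live in the historical's runtime.properties and should be kept consistent: the sum of the maxSize values across all druid.segmentCache.locations entries needs to cover druid.server.maxSize, since the latter is how much data the coordinator may assign to the node. A minimal sketch, reusing the path from your configuration:

```properties
# Maximum total segment data the coordinator may assign to this historical (100 GB)
druid.server.maxSize=100000000000

# Local disk locations where assigned segments are cached; the combined
# maxSize here must be able to hold everything the coordinator assigns
druid.segmentCache.locations=[{"path": "/home/druid/cache/historical1/indexCache", "maxSize": 100000000000}]
```

If the cache locations are smaller than druid.server.maxSize, the coordinator keeps assigning segments that the node has no room to store, which produces exactly the "too large for storage" error above.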

Do you still see the problems after doing this?

Looks like it solved the problem.

Thank you.