Segment too large: Segment [xxxx] too large for storage [/tmp/druid/indexCache:7,404,837]

In the logs of a Historical node I found an exception saying a segment is too large to be loaded.

Exception: http://pastebin.com/qaHJnsYj

Log: https://drive.google.com/file/d/0BybfIWUUOznPSFc4Ni1IQmRxeTg/view?usp=sharing

Historical runtime.properties: http://pastebin.com/pGzu9erB

Result of df command: http://pastebin.com/b00Ek5gs

What option should I update so the segment can be loaded?

Looking at the configuration and the logs you posted, it appears the node is functioning correctly at startup and downloads segments until it runs out of space. Your segment cache is configured to hold a maximum of 10G of segments, but your druid.server.maxSize is set to 100G, so the Coordinator will assign the node more segments than its cache location can actually store.

If you change

druid.segmentCache.locations=[{"path": "/home/druid/cache/historical1/indexCache", "maxSize": 10000000000}]

to

druid.segmentCache.locations=[{"path": "/home/druid/cache/historical1/indexCache", "maxSize": 100000000000}]

it will match your druid.server.maxSize.
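
For reference, a minimal sketch of how the two settings should line up in the Historical's runtime.properties (the path and values are the ones from your configuration):

# Maximum total size of segments the Coordinator will assign to this node
druid.server.maxSize=100000000000

# The on-disk segment cache must be able to hold at least that much,
# or loads will fail with "Segment [...] too large for storage"
druid.segmentCache.locations=[{"path": "/home/druid/cache/historical1/indexCache", "maxSize": 100000000000}]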

Do you still see the problem after making this change?

Looks like it solved the problem.

Thank you.