Historical: Calculate Memory for segments

I am trying to use the suggested formula for computing the memory available for segments, but the result I get is negative.

Druid memory-maps segments.

memory_for_segments = total_memory - heap_size - (processing.buffer.sizeBytes * (processing.numThreads+1)) - JVM overhead (~1G)

The greater the memory-to-disk ratio, the better the performance you should see.

VM size: 48 GB RAM, 8 CPU cores

Historical process JVM settings: -Xms6g -Xmx6g -XX:MaxDirectMemorySize=12g

Processing buffer settings:

druid.processing.buffer.sizeBytes=681574400
druid.processing.numThreads=7
druid.processing.numMergeBuffers=2
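
For reference, here is what these settings imply for the direct-memory footprint (a minimal sketch in Python; the variable names are mine, the numbers are the ones above, and the sizing rule is Druid's documented sizeBytes * (numThreads + numMergeBuffers + 1)):

# Sketch: direct-memory footprint implied by the settings above.
GB = 1024 ** 3
buffer_size = 681_574_400        # druid.processing.buffer.sizeBytes
num_threads = 7                  # druid.processing.numThreads
num_merge_buffers = 2            # druid.processing.numMergeBuffers

# Druid allocates one direct buffer per processing thread, one per
# merge buffer, plus one spare.
direct_needed = buffer_size * (num_threads + num_merge_buffers + 1)
print(f"direct memory needed: {direct_needed / GB:.2f} GB")  # ~6.35 GB

That is about 6.35 GB, comfortably under the 12 GB MaxDirectMemorySize above.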

Can you please help here?

How is it going negative?

memory_for_segments = total_memory - heap_size - (processing.buffer.sizeBytes * (processing.numThreads+1)) - JVM overhead (~1G)

As per the formula you mentioned:
memory_for_segments = 48G - 6G - (0.63G * (7 + 1)) - 1G ≈ 35.96G
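
Plugging in the exact byte value instead of the rounded 0.63G gives essentially the same, clearly positive, answer (a small sketch using only the numbers quoted above):

# Sketch: the same formula with exact bytes instead of the rounded 0.63G.
GB = 1024 ** 3
total_memory = 48 * GB
heap_size = 6 * GB
buffer_size = 681_574_400        # druid.processing.buffer.sizeBytes
num_threads = 7                  # druid.processing.numThreads
jvm_overhead = 1 * GB            # the ~1G overhead from the formula

memory_for_segments = (total_memory - heap_size
                       - buffer_size * (num_threads + 1)
                       - jvm_overhead)
print(f"memory_for_segments: {memory_for_segments / GB:.2f} GB")  # ~35.92 GB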

Thanks,

Sashi