Batch ingestion error for big input file (> 5 GB)

I am trying to batch-ingest a large file (> 5 GB) into Druid, and the ingestion task fails with an out-of-memory error.
What are the runtime requirements for ingesting files larger than 5 GB, and how can we handle this?
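
From the docs, my understanding is that the plain index task buffers rows on heap and spills based on its tuningConfig, so one thing I was planning to try is lowering the in-memory row limit and the segment partition size, roughly like the sketch below (the values are guesses on my part, and I believe the row limit is named rowFlushBoundary rather than maxRowsInMemory in some older releases):

"tuningConfig": {
  "type": "index",
  "targetPartitionSize": 5000000,
  "maxRowsInMemory": 75000
}
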
The MiddleManager jvm.config is as shown below:
-server
-Xms8g
-Xmx8g
-Duser.timezone=UTC
-Dfile.encoding=UTF-8
-Djava.io.tmpdir=var/tmp
-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager
The MiddleManager runtime.properties is set as below:

druid.service=druid/middleManager
druid.port=8091

# Number of tasks per middleManager

druid.worker.capacity=3

# Task launch parameters

druid.indexer.runner.javaOpts=-server -Xmx4g -Duser.timezone=UTC -Dfile.encoding=UTF-8 -Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager
druid.indexer.task.baseTaskDir=var/druid/task

# HTTP server threads

druid.server.http.numThreads=9

# Processing threads and buffers

druid.processing.buffer.sizeBytes=256000000
druid.processing.numThreads=2

# Hadoop indexing

druid.indexer.task.hadoopWorkingPath=var/druid/hadoop-tmp
druid.indexer.task.defaultHadoopCoordinates=["org.apache.hadoop:hadoop-client:2.3.0"]
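
If I am budgeting memory correctly, each peon needs its 4 GB heap plus direct memory of roughly (druid.processing.numThreads + 1) * druid.processing.buffer.sizeBytes = 3 * 256 MB ≈ 768 MB, so three peons plus the 8 GB MiddleManager heap already come to around 22 GB. Would raising the peon heap and setting an explicit direct-memory cap along these lines help (the sizes below are guesses on my part)?

# guess: bigger peon heap plus an explicit direct-memory limit
druid.indexer.runner.javaOpts=-server -Xmx8g -XX:MaxDirectMemorySize=2g -Duser.timezone=UTC -Dfile.encoding=UTF-8 -Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager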

I know we are missing some configuration. Another option I considered is switching to the Hadoop index task (index_hadoop), so that the input is partitioned across mappers instead of being read by a single peon; I assume the partitionsSpec in its tuningConfig would look roughly like this (untested on my side):
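
"partitionsSpec": {
  "type": "hashed",
  "targetPartitionSize": 5000000
}

Any input here would be appreciated.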