Big file loading into Druid

We loaded an 8 GB CSV file into Druid to create a datasource. After around 50 minutes the task failed; the error log said “java.lang.OutOfMemoryError: GC overhead limit exceeded”. The file has 280 columns. Please help us solve this.

Are you using the Hadoop index task or the local index task?
FWIW, for ingesting large files the Hadoop index task is recommended; try that if you are not already using it. It runs the ingestion as a MapReduce job on your Hadoop cluster instead of building segments in a single JVM, which avoids exactly this kind of heap pressure.
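In case it helps, here is a minimal sketch of an `index_hadoop` ingestion spec for a CSV file. All the specifics are placeholders I made up: the datasource name, the HDFS path, the interval, and the column names. The `columns` list would need your full 280-column header in file order, and `dimensionsSpec` lists whichever of those you want indexed as dimensions.

```json
{
  "type": "index_hadoop",
  "spec": {
    "dataSchema": {
      "dataSource": "my_datasource",
      "parser": {
        "type": "hadoopyString",
        "parseSpec": {
          "format": "csv",
          "timestampSpec": { "column": "timestamp", "format": "auto" },
          "columns": ["timestamp", "col1", "col2", "col3"],
          "dimensionsSpec": { "dimensions": ["col1", "col2", "col3"] }
        }
      },
      "metricsSpec": [
        { "type": "count", "name": "count" }
      ],
      "granularitySpec": {
        "type": "uniform",
        "segmentGranularity": "DAY",
        "queryGranularity": "NONE",
        "intervals": ["2017-01-01/2017-02-01"]
      }
    },
    "ioConfig": {
      "type": "hadoop",
      "inputSpec": {
        "type": "static",
        "paths": "hdfs://namenode:8020/data/big_file.csv"
      }
    },
    "tuningConfig": {
      "type": "hadoop"
    }
  }
}
```

You would POST this spec to the Overlord's task endpoint (`/druid/indexer/v1/task`), same as any other index task, and the Overlord will launch the MapReduce job against the paths in `ioConfig`.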