Out of memory error when ingesting from SQL

Trying to ingest from a PostgreSQL database with ~3,000,000 rows, I get this error:

2020-08-14T17:27:18,917 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.ServletContextHandler@65ad2b42{/,null,AVAILABLE}
2020-08-14T17:27:18,943 INFO [main] org.eclipse.jetty.server.AbstractConnector - Started ServerConnector@8e0ab7e{HTTP/1.1,[http/1.1]}{0.0.0.0:8100}
2020-08-14T17:27:18,944 INFO [main] org.eclipse.jetty.server.Server - Started @13746ms
2020-08-14T17:27:18,944 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle - Starting lifecycle [module] stage [ANNOUNCEMENTS]
2020-08-14T17:27:18,945 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle - Successfully started lifecycle [module]
Terminating due to java.lang.OutOfMemoryError: GC overhead limit exceeded

Here is my ingestion spec:

{
  "type": "index_parallel",
  "spec": {
    "dataSchema": {
      "dataSource": "test",
      "granularitySpec": {
        "type": "uniform",
        "segmentGranularity": "DAY",
        "queryGranularity": "HOUR",
        "intervals": ["2020-04-01/2020-08-01"],
        "rollup": true
      },
      "parser": {
        "parseSpec": {
          "format": "json",
          "timestampSpec": {
            "format": "iso",
            "column": "timestamp"
          },
          "dimensionsSpec": {
            "dimensions": [
              "col1",
              "col2"
            ]
          },
          "metricsSpec": [
            { "type": "doubleSum", "name": "quantitySum", "fieldName": "col3" }
          ]
        }
      }
    },
    "ioConfig": {
      "type": "index_parallel",
      "firehose": {
        "type": "sql",
        "database": {
          "type": "postgresql",
          "connectorConfig": {
            "connectURI": "jdbc:postgresql://localhost:5432/",
            "user": "user",
            "password": "password"
          }
        },
        "sqls": [
          "SELECT col1, col2, col3, timestamp FROM table WHERE timestamp BETWEEN '2020-04-01 00:00:00' AND '2020-08-01 11:59:59'"
        ]
      }
    },
    "tuningConfig": {
      "type": "index_parallel",
      "maxRowsPerSegment": 5000000
    }
  }
}

I am running the micro-quickstart script to start Druid 0.18.1 on macOS.

Maybe reduce the amount of data you ingest to something much smaller, then work your way back up?
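If a smaller batch works, one way to scale back up without shrinking the overall date range is to split the single query in "sqls" into several smaller ones (for example, one per month), so each sub-task only has to pull a smaller result set at a time. Here is a minimal sketch of just the "sqls" portion, reusing the column and table names from your spec (the exact monthly boundaries are only an example):

"sqls": [
  "SELECT col1, col2, col3, timestamp FROM table WHERE timestamp >= '2020-04-01 00:00:00' AND timestamp < '2020-05-01 00:00:00'",
  "SELECT col1, col2, col3, timestamp FROM table WHERE timestamp >= '2020-05-01 00:00:00' AND timestamp < '2020-06-01 00:00:00'",
  "SELECT col1, col2, col3, timestamp FROM table WHERE timestamp >= '2020-06-01 00:00:00' AND timestamp < '2020-07-01 00:00:00'",
  "SELECT col1, col2, col3, timestamp FROM table WHERE timestamp >= '2020-07-01 00:00:00' AND timestamp < '2020-08-01 00:00:00'"
]

If the smaller batches still hit the GC overhead limit, the task heap in the micro-quickstart configuration is fairly small by default, so giving the task JVMs more memory (for example via druid.indexer.runner.javaOptsArray in the middleManager runtime.properties) may also be worth trying.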