Unable to ingest data from PostgreSQL to Druid

Hi guys,
I’m trying to ingest PostgreSQL data into Druid using the SQL firehose. The task gets created, but it fails in the Overlord, and I don’t see any descriptive error message in the Overlord console.

Ingestion spec:

{
    "type": "index",
    "spec": {
        "dataSchema": {
            "dataSource": "data_set_new",
            "parser": {
                "type": "string",
                "parseSpec": {
                    "format": "json",
                    "columns": [
                        "intempactivefactkey",
                        "dtason",
                        "intempdimkey",
                        "intdeliveryunitdimkey",
                        "intsbudimkey"
                    ],
                    "timestampSpec": {
                        "column": "dtason",
                        "format": "auto"
                    },
                    "dimensionsSpec": {
                        "dimensions": [
                            "intempactivefactkey",
                            "dtason",
                            "intempdimkey",
                            "intdeliveryunitdimkey",
                            "intsbudimkey"
                        ]
                    }
                }
            },
            "granularitySpec": {
                "type": "uniform",
                "segmentGranularity": "DAY",
                "queryGranularity": "NONE",
                "rollup": false
            }
        },
        "ioConfig": {
            "type": "index",
            "firehose": {
                "type": "sql",
                "database": {
                    "type": "postgresql",
                    "connectorConfig": {
                        "connectURI": "jdbc:postgresql://address:5432/DB",
                        "user": "username",
                        "password": "password"
                    }
                },
                "sqls": [
                    "SELECT * FROM schema.tablename LIMIT 100"
                ]
            },
            "appendToExisting": false
        },
        "tuningConfig": {
            "forceExtendableShardSpecs": true,
            "type": "index"
        }
    }
}

Error log:

2019-07-09T14:43:44,037 INFO [task-runner-0-priority-0] org.apache.druid.data.input.impl.prefetch.PrefetchSqlFirehoseFactory - Create a new firehose for [1] queries
2019-07-09T14:43:44,040 INFO [firehose_fetch_0] org.apache.druid.data.input.impl.prefetch.Fetcher - Fetching [0]th object[SELECT * FROM hradm.biemdfactempactive LIMIT 100], fetchedBytes[0]
2019-07-09T14:43:44,094 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.initialization.jetty.CustomExceptionMapper to GuiceManagedComponentProvider with the scope "Singleton"
2019-07-09T14:43:44,097 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.initialization.jetty.ForbiddenExceptionMapper to GuiceManagedComponentProvider with the scope "Singleton"
2019-07-09T14:43:44,097 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.initialization.jetty.BadRequestExceptionMapper to GuiceManagedComponentProvider with the scope "Singleton"
2019-07-09T14:43:44,098 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider to GuiceManagedComponentProvider with the scope "Singleton"
2019-07-09T14:43:44,106 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding com.fasterxml.jackson.jaxrs.smile.JacksonSmileProvider to GuiceManagedComponentProvider with the scope "Singleton"
2019-07-09T14:43:44,380 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.http.security.StateResourceFilter to GuiceInstantiatedComponentProvider
2019-07-09T14:43:44,400 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.http.SegmentListerResource to GuiceManagedComponentProvider with the scope "PerRequest"
2019-07-09T14:43:44,406 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.QueryResource to GuiceInstantiatedComponentProvider
2019-07-09T14:43:44,411 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.segment.realtime.firehose.ChatHandlerResource to GuiceInstantiatedComponentProvider
2019-07-09T14:43:44,413 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.http.security.ConfigResourceFilter to GuiceInstantiatedComponentProvider
2019-07-09T14:43:44,416 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.query.lookup.LookupListeningResource to GuiceInstantiatedComponentProvider
2019-07-09T14:43:44,417 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.query.lookup.LookupIntrospectionResource to GuiceInstantiatedComponentProvider
2019-07-09T14:43:44,419 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.StatusResource to GuiceManagedComponentProvider with the scope "Undefined"
2019-07-09T14:43:44,435 WARN [main] com.sun.jersey.spi.inject.Errors - The following warnings have been detected with resource and/or provider classes:
  WARNING: A HTTP GET method, public void org.apache.druid.server.http.SegmentListerResource.getSegments(long,long,long,javax.servlet.http.HttpServletRequest) throws java.io.IOException, MUST return a non-void type.
2019-07-09T14:43:44,449 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.ServletContextHandler@30704f85{/,null,AVAILABLE}
2019-07-09T14:43:44,459 INFO [main] org.eclipse.jetty.server.AbstractConnector - Started ServerConnector@2e93108a{HTTP/1.1,[http/1.1]}{0.0.0.0:8102}
2019-07-09T14:43:44,459 INFO [main] org.eclipse.jetty.server.Server - Started @5286ms
2019-07-09T14:43:44,459 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle - Starting lifecycle [module] stage [ANNOUNCEMENTS]
2019-07-09T14:43:44,459 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.curator.announcement.Announcer.start()] on object[org.apache.druid.curator.announcement.Announcer@34e25492].
2019-07-09T14:43:44,459 INFO [main] org.apache.druid.curator.announcement.Announcer - Starting announcer
2019-07-09T14:43:44,484 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle - Successfully started lifecycle [module]
2019-07-09T14:43:44,576 INFO [task-runner-0-priority-0] org.apache.druid.data.input.impl.prefetch.CacheManager - Object[SELECT * FROM hradm.biemdfactempactive LIMIT 100] is cached. Current cached bytes is [34952]
2019-07-09T14:43:44,587 ERROR [task-runner-0-priority-0] org.apache.druid.indexing.common.task.IndexTask - Encountered exception in DETERMINE_PARTITIONS.
java.lang.ClassCastException: java.util.LinkedHashMap cannot be cast to java.nio.ByteBuffer
	at org.apache.druid.segment.transform.TransformingStringInputRowParser.parseBatch(TransformingStringInputRowParser.java:31) ~[druid-processing-0.15.0-incubating.jar:0.15.0-incubating]

Do I need to configure the same PostgreSQL server in both the common.runtime.properties file and the ingestion spec?

Thanks in advance.

If you have not already done so, can you please make sure you have the ‘postgresql-metadata-storage’ extension added in the property files?
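For reference, it would typically be listed in druid.extensions.loadList in common.runtime.properties, something like this (keep whatever other extensions you already load; the exact list depends on your setup):

    druid.extensions.loadList=["postgresql-metadata-storage"]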

That is already in place.

I’m guessing your parser type is defined as “string”; can you change it to “map” and try?
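The SQL firehose hands each row to the parser as a map rather than raw bytes, which is why the string parser hits that ClassCastException. The parser block would look roughly like this (a sketch based on your spec, assuming a timeAndDims parseSpec and that you don’t also need the timestamp column as a dimension):

    "parser": {
        "type": "map",
        "parseSpec": {
            "format": "timeAndDims",
            "timestampSpec": {
                "column": "dtason",
                "format": "auto"
            },
            "dimensionsSpec": {
                "dimensions": [
                    "intempactivefactkey",
                    "intempdimkey",
                    "intdeliveryunitdimkey",
                    "intsbudimkey"
                ]
            }
        }
    }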

Also, if that doesn’t work, can you post your parser spec?

Yeah, I was using the wrong parser; the issue is fixed now. Thanks!