Druid 0.12: No serializer found for class io.druid.client.ImmutableDruidDataSource [Formatting]

Problem

I seem to have misconfigured the Druid installation and cannot retrieve the full metadata for datasources.

A GET request to http://:8081/druid/coordinator/v1/datasources/wikiticker?full=true fails with:

com.fasterxml.jackson.databind.JsonMappingException: No serializer found for class io.druid.client.ImmutableDruidDataSource and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS) )

Any pointers on where I should look to resolve this issue?
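For context, the exception is Jackson refusing to serialize a bean on which it discovers no properties, with FAIL_ON_EMPTY_BEANS enabled (the default). A minimal standalone sketch of that behavior; the class and method names here are hypothetical, not Druid's:

```java
import com.fasterxml.jackson.databind.JsonMappingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;

public class EmptyBeanDemo {
    // A bean with no public fields or getters: Jackson discovers no properties,
    // which is the situation the coordinator hits with ImmutableDruidDataSource.
    static class NoProps {
        private final String hidden = "x";
    }

    // With default settings (FAIL_ON_EMPTY_BEANS enabled), serialization throws
    // the same JsonMappingException seen in the stacktrace.
    static boolean failsStrict(Object o) throws Exception {
        try {
            new ObjectMapper().writeValueAsString(o);
            return false;
        } catch (JsonMappingException e) {
            return true; // "No serializer found ... no properties discovered"
        }
    }

    // Disabling the feature makes an empty bean serialize as "{}" instead.
    static String serializeLenient(Object o) throws Exception {
        return new ObjectMapper()
                .disable(SerializationFeature.FAIL_ON_EMPTY_BEANS)
                .writeValueAsString(o);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(failsStrict(new NoProps()));      // true
        System.out.println(serializeLenient(new NoProps())); // {}
    }
}
```

This only illustrates what the message means; it does not explain why the coordinator's ObjectMapper fails to see the class's properties (e.g. a jar/version mismatch on the classpath is one possibility, but that is speculation).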

Druid Setup

  • Druid version: latest stable release 0.12.0, installed from binaries.
  • System: Kerberized Hadoop cluster with Druid running on 4 nodes.
  • Loaded extensions: ["druid-histogram","druid-datasketches","druid-lookups-cached-global","mysql-metadata-storage","druid-kerberos","druid-hdfs-storage","druid-avro-extensions","druid-parquet-extensions"]
  • Common config

druid.extensions.loadList=["druid-histogram","druid-datasketches","druid-lookups-cached-global","mysql-metadata-storage","druid-kerberos","druid-hdfs-storage","druid-avro-extensions","druid-parquet-extensions"]

druid.indexer.logs.directory=</hdfs/root/path/druid/logdir>

druid.indexer.logs.type=hdfs

druid.javascript.enabled=false

druid.metadata.storage.connector.connectURI=jdbc:mysql://:3306/druid

druid.metadata.storage.connector.password=REDACTED

druid.metadata.storage.connector.user=REDACTED

druid.metadata.storage.type=mysql

druid.sql.enable=false

druid.startup.logging.logProperties=true

druid.storage.storageDirectory=</hdfs/root/path/druid>

druid.storage.type=hdfs

druid.zk.paths.base=/druid

druid.zk.service.host=<host 1>:2181,<host 2>:2181,<host 3>:2181

druid.extensions.directory=/opt/cloudera/parcels/DRUID-0.12.0-0.0.4/extensions

druid.extensions.hadoopDependenciesDir=/opt/cloudera/parcels/DRUID-0.12.0-0.0.4/hadoop-dependencies

druid.selectors.indexing.serviceName=druid/overlord

druid.selectors.coordinator.serviceName=druid/coordinator

druid.request.logging.feed=druid_requests

druid.request.logging.type=emitter

druid.monitoring.emissionPeriod=PT1m

druid.hadoop.security.kerberos.principal=REDACTED

druid.hadoop.security.kerberos.keytab=REDACTED

  • Coordinator runtime properties
    druid.port=8081

druid.service=druid/coordinator

druid.coordinator.period=PT30S

druid.coordinator.startDelay=PT30S

  • Historical runtime properties
    druid.port=8083

druid.service=druid/historical

druid.server.http.numThreads=40

druid.processing.buffer.sizeBytes=256000000

druid.processing.numMergeBuffers=2

druid.processing.numThreads=2

druid.server.maxSize=300000000000

druid.historical.cache.useCache=true

druid.historical.cache.populateCache=true

druid.cache.type=local

druid.cache.sizeInBytes=10000000

druid.segmentCache.locations=[{"path":"</local/path/to/druid/segments>","maxSize":300000000000}]

  • Overlord runtime properties
    druid.port=8090

druid.service=druid/overlord

druid.indexer.queue.startDelay=PT30S

druid.indexer.runner.type=remote

druid.indexer.storage.type=metadata

  • MiddleManager runtime properties
    druid.indexer.runner.javaOptsArray=["-server", "-Xmx3g", "-Duser.timezone=UTC", "-Dfile.encoding=UTF-8", "-XX:+UseG1GC", "-XX:MaxGCPauseMillis=100", "-XX:+PrintGCDetails", "-XX:+PrintGCTimeStamps", "-Dhadoop.mapreduce.job.classloader=true"]

druid.indexer.task.defaultHadoopCoordinates=["org.apache.hadoop:hadoop-client:2.7.3"]

druid.indexer.task.restoreTasksOnRestart=true

druid.port=8091

druid.processing.buffer.sizeBytes=0

druid.processing.numThreads=2

druid.server.http.numThreads=40

druid.worker.capacity=2

druid.service=druid/middlemanager

druid.indexer.task.baseTaskDir=/data/disk1/druid/task

Successful HTTP request

Request:

http://:8081/druid/coordinator/v1/datasources/

Response:

["cv_data","wikiticker"]

Failed HTTP request

Request:

http://:8081/druid/coordinator/v1/datasources/wikiticker?full=true

Response:

HTTP ERROR: 500

Problem accessing /druid/coordinator/v1/datasources/wikiticker. Reason:

com.fasterxml.jackson.databind.JsonMappingException: No serializer found for class io.druid.client.ImmutableDruidDataSource and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS) )

Stacktrace

com.fasterxml.jackson.databind.JsonMappingException: No serializer found for class io.druid.client.ImmutableDruidDataSource and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS) )

at com.fasterxml.jackson.databind.ser.impl.UnknownSerializer.failForEmpty(UnknownSerializer.java:59) ~[jackson-databind-2.4.6.jar:2.4.6]

at com.fasterxml.jackson.databind.ser.impl.UnknownSerializer.serialize(UnknownSerializer.java:26) ~[jackson-databind-2.4.6.jar:2.4.6]

at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:128) ~[jackson-databind-2.4.6.jar:2.4.6]

at com.fasterxml.jackson.databind.ObjectWriter.writeValue(ObjectWriter.java:602) ~[jackson-databind-2.4.6.jar:2.4.6]

at com.fasterxml.jackson.jaxrs.base.ProviderBase.writeTo(ProviderBase.java:648) ~[jackson-jaxrs-base-2.4.6.jar:2.4.6]

at com.sun.jersey.spi.container.ContainerResponse.write(ContainerResponse.java:302) ~[jersey-server-1.19.3.jar:1.19.3]

at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1510) ~[jersey-server-1.19.3.jar:1.19.3]

at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419) ~[jersey-server-1.19.3.jar:1.19.3]

at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409) ~[jersey-server-1.19.3.jar:1.19.3]

at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409) ~[jersey-servlet-1.19.3.jar:1.19.3]

at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558) ~[jersey-servlet-1.19.3.jar:1.19.3]

at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733) ~[jersey-servlet-1.19.3.jar:1.19.3]

at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) ~[javax.servlet-api-3.1.0.jar:3.1.0]

at com.google.inject.servlet.ServletDefinition.doServiceImpl(ServletDefinition.java:286) ~[guice-servlet-4.1.0.jar:?]

at com.google.inject.servlet.ServletDefinition.doService(ServletDefinition.java:276) ~[guice-servlet-4.1.0.jar:?]

at com.google.inject.servlet.ServletDefinition.service(ServletDefinition.java:181) ~[guice-servlet-4.1.0.jar:?]

at com.google.inject.servlet.ManagedServletPipeline.service(ManagedServletPipeline.java:91) ~[guice-servlet-4.1.0.jar:?]

at com.google.inject.servlet.ManagedFilterPipeline.dispatch(ManagedFilterPipeline.java:120) ~[guice-servlet-4.1.0.jar:?]

at com.google.inject.servlet.GuiceFilter.doFilter(GuiceFilter.java:135) ~[guice-servlet-4.1.0.jar:?]

at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) ~[jetty-servlet-9.3.19.v20170502.jar:9.3.19.v20170502]

at io.druid.server.http.RedirectFilter.doFilter(RedirectFilter.java:72) ~[druid-server-0.12.0.jar:0.12.0]

at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) ~[jetty-servlet-9.3.19.v20170502.jar:9.3.19.v20170502]

at io.druid.server.security.PreResponseAuthorizationCheckFilter.doFilter(PreResponseAuthorizationCheckFilter.java:84) ~[druid-server-0.12.0.jar:0.12.0]

at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) ~[jetty-servlet-9.3.19.v20170502.jar:9.3.19.v20170502]

at io.druid.server.security.AllowAllAuthenticator$1.doFilter(AllowAllAuthenticator.java:85) ~[druid-server-0.12.0.jar:0.12.0]

at io.druid.server.security.AuthenticationWrappingFilter.doFilter(AuthenticationWrappingFilter.java:60) ~[druid-server-0.12.0.jar:0.12.0]

at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) ~[jetty-servlet-9.3.19.v20170502.jar:9.3.19.v20170502]

at io.druid.server.security.SecuritySanityCheckFilter.doFilter(SecuritySanityCheckFilter.java:86) ~[druid-server-0.12.0.jar:0.12.0]

at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) ~[jetty-servlet-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582) ~[jetty-servlet-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:224) ~[jetty-server-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180) ~[jetty-server-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512) ~[jetty-servlet-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185) ~[jetty-server-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112) ~[jetty-server-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) ~[jetty-server-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:493) ~[jetty-server-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:52) ~[jetty-server-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134) ~[jetty-server-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.server.Server.handle(Server.java:534) ~[jetty-server-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:320) ~[jetty-server-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251) ~[jetty-server-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283) ~[jetty-io-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108) ~[jetty-io-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93) ~[jetty-io-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303) ~[jetty-util-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148) ~[jetty-util-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136) ~[jetty-util-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671) ~[jetty-util-9.3.19.v20170502.jar:9.3.19.v20170502]

at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589) ~[jetty-util-9.3.19.v20170502.jar:9.3.19.v20170502]

at java.lang.Thread.run(Thread.java:745) [?:1.8.0_121]

A similar error occurs for cv_data.

Additional request information

Results for each endpoint under http://:8081/druid/coordinator/v1/datasources/wikiticker/:

http://:8081/druid/coordinator/v1/datasources/wikiticker/intervals

["2015-09-12T00:00:00.000Z/2015-09-13T00:00:00.000Z"]

http://:8081/druid/coordinator/v1/datasources/wikiticker/intervals?full

{"2015-09-12T00:00:00.000Z/2015-09-13T00:00:00.000Z":{"wikiticker_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2018-04-11T14:40:59.541Z":{"metadata":{"dataSource":"wikiticker","interval":"2015-09-12T00:00:00.000Z/2015-09-13T00:00:00.000Z","version":"2018-04-11T14:40:59.541Z","loadSpec":{"type":"hdfs","path":"hdfs://<host 1>:8020/path/to/root/20150912T000000.000Z_20150913T000000.000Z/2018-04-11T14_40_59.541Z/0_index.zip"},"dimensions":"channel,cityName,comment,countryIsoCode,countryName,isAnonymous,isMinor,isNew,isRobot,isUnpatrolled,metroCode,namespace,page,regionIsoCode,regionName,user","metrics":"count,added,deleted,delta,user_unique","shardSpec":{"type":"none"},"binaryVersion":9,"size":5535121,"identifier":"wikiticker_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2018-04-11T14:40:59.541Z"},"servers":["<host 1>:8083","<host 2>:8083"]}}}

http://:8081/druid/coordinator/v1/datasources/wikiticker/segments?full

[{"dataSource":"wikiticker","interval":"2015-09-12T00:00:00.000Z/2015-09-13T00:00:00.000Z","version":"2018-04-11T14:40:59.541Z","loadSpec":{"type":"hdfs","path":"hdfs://<host 1>:8020/path/to/root//segments/wikiticker/20150912T000000.000Z_20150913T000000.000Z/2018-04-11T14_40_59.541Z/0_index.zip"},"dimensions":"channel,cityName,comment,countryIsoCode,countryName,isAnonymous,isMinor,isNew,isRobot,isUnpatrolled,metroCode,namespace,page,regionIsoCode,regionName,user","metrics":"count,added,deleted,delta,user_unique","shardSpec":{"type":"none"},"binaryVersion":9,"size":5535121,"identifier":"wikiticker_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2018-04-11T14:40:59.541Z"}]

http://:8081/druid/coordinator/v1/datasources/wikiticker/tiers

["_default_tier"]
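Since the per-interval and per-segment endpoints above do return the full segment metadata, that JSON can be consumed client-side while ?full=true is failing. A sketch using Jackson's tree model; the field names come from the responses above, while the class and method names are hypothetical:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.ArrayList;
import java.util.List;

public class SegmentPaths {
    // Pull each segment's deep-storage path out of the
    // /datasources/<ds>/segments?full response (a JSON array of segment objects).
    static List<String> loadSpecPaths(String json) throws Exception {
        JsonNode segments = new ObjectMapper().readTree(json);
        List<String> paths = new ArrayList<>();
        for (JsonNode segment : segments) {
            paths.add(segment.path("loadSpec").path("path").asText());
        }
        return paths;
    }

    public static void main(String[] args) throws Exception {
        // Trimmed-down version of the response shown above; host and path are placeholders.
        String response = "[{\"dataSource\":\"wikiticker\","
                + "\"loadSpec\":{\"type\":\"hdfs\",\"path\":\"hdfs://host:8020/segments/0_index.zip\"}}]";
        System.out.println(loadSpecPaths(response)); // [hdfs://host:8020/segments/0_index.zip]
    }
}
```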

Hi, I reopened your issue and left some comments.

Jihoon

On Wed, Apr 18, 2018 at 2:56 AM, jy.keung@gmail.com wrote: