Druid consumed messages from Kafka, but the Query server shows nothing

Hi everyone,

I have been using imply-1.3.0 for just 3 days; I am a newbie.

On the query server I set up Kafka, created a topic named pageviews, and started Tranquility Kafka.

As you can see from the logs, the Tranquility server was consuming data from Kafka.

But why does nothing show up in the Query server?

2016-08-13 09:12:11,342 [KafkaConsumer-1] INFO o.h.validator.internal.util.Version - HV000001: Hibernate Validator 5.1.3.Final

2016-08-13 09:12:12,053 [KafkaConsumer-1] INFO io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, directory='extensions', hadoopDependenciesDir='hadoop-dependencies', hadoopContainerDruidClasspath='null', loadList=null}]

2016-08-13 09:12:12,776 [KafkaConsumer-1] INFO c.metamx.emitter.core.LoggingEmitter - Start: started [true]

2016-08-13 09:12:13,435 [KafkaConsumer-1] INFO c.m.t.finagle.FinagleRegistry - Adding resolver for scheme[druidTask!druid:overlord].

2016-08-13 09:12:22,555 [KafkaConsumer-CommitThread] INFO c.m.tranquility.kafka.KafkaConsumer - Flushed {pageviews={receivedCount=33, sentCount=0, droppedCount=33, unparseableCount=0}} pending messages in 2ms and committed offsets in 8ms.

2016-08-13 09:12:37,562 [KafkaConsumer-CommitThread] INFO c.m.tranquility.kafka.KafkaConsumer - Flushed {pageviews={receivedCount=3, sentCount=0, droppedCount=3, unparseableCount=0}} pending messages in 0ms and committed offsets in 5ms.

2016-08-13 09:12:52,569 [KafkaConsumer-CommitThread] INFO c.m.tranquility.kafka.KafkaConsumer - Flushed {pageviews={receivedCount=3, sentCount=0, droppedCount=3, unparseableCount=0}} pending messages in 0ms and committed offsets in 7ms.


kafka.json

{
  "dataSources": {
    "pageviews-kafka": {
      "spec": {
        "dataSchema": {
          "dataSource": "pageviews-kafka",
          "parser": {
            "type": "string",
            "parseSpec": {
              "timestampSpec": {
                "column": "time",
                "format": "auto"
              },
              "dimensionsSpec": {
                "dimensions": [
                  "url",
                  "user"
                ]
              },
              "format": "json"
            }
          },
          "granularitySpec": {
            "type": "uniform",
            "segmentGranularity": "hour",
            "queryGranularity": "none"
          },
          "metricsSpec": [
            {
              "name": "views",
              "type": "count"
            },
            {
              "name": "latencyMs",
              "type": "doubleSum",
              "fieldName": "latencyMs"
            }
          ]
        },
        "ioConfig": {
          "type": "realtime"
        },
        "tuningConfig": {
          "type": "realtime",
          "maxRowsInMemory": "100000",
          "intermediatePersistPeriod": "PT10M",
          "windowPeriod": "PT10M"
        }
      },
      "properties": {
        "task.partitions": "1",
        "task.replicants": "1",
        "topicPattern": "pageviews"
      }
    }
  },
  "properties": {
    "zookeeper.connect": "master1:2181,master2:2181,master3:2181",
    "druid.discovery.curator.path": "/druid/discovery",
    "druid.selectors.indexing.serviceName": "druid/overlord",
    "commit.periodMillis": "15000",
    "consumer.numThreads": "2",
    "kafka.zookeeper.connect": "master3:2181",
    "kafka.group.id": "tranquility-kafka",
    "serialization.format": "smile",
    "druidBeam.taskLocator": "overlord"
  }
}


Hey Chanh,

The events are all being dropped by Tranquility, most likely because of a timestamp issue (either the timestamps are too old or the wrong column is being referenced in the timestampSpec). Make sure that your events are recent, since Tranquility only works with realtime data.
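To illustrate David's point, here is a minimal sketch of building an event that Tranquility will accept. The field names (`time`, `url`, `user`, `latencyMs`) come from the kafka.json above; the values are made-up samples. Tranquility drops any event whose timestamp falls outside the `windowPeriod` (PT10M in this config) around the current time.

```python
# Build a pageviews event whose "time" field is the current UTC timestamp,
# so it lands inside Tranquility's PT10M windowPeriod.
import json
from datetime import datetime, timezone

event = {
    "time": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
    "url": "/home",       # sample dimension values; names match the
    "user": "alice",      # dimensionsSpec in kafka.json above
    "latencyMs": 123,     # summed by the doubleSum metric
}
print(json.dumps(event))
```

Send the printed line to the `pageviews` topic with any Kafka producer (e.g. kafka-console-producer). If your producer replays historical data, the timestamps will be outside the window and you will keep seeing `droppedCount` equal to `receivedCount`.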

I got it, David,

but after I changed the Kafka producer to generate events with current timestamps, something else came up.

2016-08-13 11:54:14,815 [KafkaConsumer-CommitThread] INFO c.m.tranquility.kafka.KafkaConsumer - Flushed 0 pending messages in 0ms and committed offsets in 0ms.

2016-08-13 11:54:20,218 [finagle/netty3-17] WARN c.m.tranquility.beam.ClusteredBeam - Emitting alert: [anomaly] Failed to propagate events: druid:overlord/pageviews-kafka

{
  "eventCount" : 1,
  "timestamp" : "2016-08-13T11:00:00.000Z",
  "beams" : "MergingPartitioningBeam(DruidBeam(interval = 2016-08-13T11:00:00.000Z/2016-08-13T12:00:00.000Z, partition = 0, tasks = [index_realtime_pageviews-kafka_2016-08-13T11:00:00.000Z_0_0/pageviews-kafka-011-0000-0000]))"
}

com.twitter.finagle.ChannelWriteException: java.net.ConnectException: Connection refused: master3/10.197.0.5:8102 from service: druidTask!druid:overlord!index_realtime_pageviews-kafka_2016-08-13T11:00:00.000Z_0_0

   	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:1.8.0_91]

   	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[na:1.8.0_91]

   	at org.jboss.netty.channel.socket.nio.NioClientBoss.connect(NioClientBoss.java:152) [io.netty.netty-3.10.5.Final.jar:na]

   	at org.jboss.netty.channel.socket.nio.NioClientBoss.processSelectedKeys(NioClientBoss.java:105) [io.netty.netty-3.10.5.Final.jar:na]

   	at org.jboss.netty.channel.socket.nio.NioClientBoss.process(NioClientBoss.java:79) [io.netty.netty-3.10.5.Final.jar:na]

   	at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337) [io.netty.netty-3.10.5.Final.jar:na]

   	at org.jboss.netty.channel.socket.nio.NioClientBoss.run(NioClientBoss.java:42) [io.netty.netty-3.10.5.Final.jar:na]

   	at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) [io.netty.netty-3.10.5.Final.jar:na]

   	at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) [io.netty.netty-3.10.5.Final.jar:na]

   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_91]

   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_91]

   	at java.lang.Thread.run(Thread.java:745) [na:1.8.0_91]

Caused by: java.net.ConnectException: Connection refused: master3/10.197.0.5:8102

   	... 12 common frames omitted

2016-08-13 11:54:20,224 [finagle/netty3-17] INFO c.metamx.emitter.core.LoggingEmitter - Event [{"feed":"alerts","timestamp":"2016-08-13T11:54:20.223Z","service":"tranquility","host":"localhost","severity":"anomaly","description":"Failed to propagate events: druid:overlord/pageviews-kafka","data":{"exceptionType":"com.twitter.finagle.ChannelWriteException","exceptionStackTrace":"com.twitter.finagle.ChannelWriteException: java.net.ConnectException: Connection refused: master3/10.197.0.5:8102 from service: druidTask!druid:overlord!index_realtime_pageviews-kafka_2016-08-13T11:00:00.000Z_0_0\n\tat com.twitter.finagle.NoStacktrace(Unknown Source)\nCaused by: java.net.ConnectException: Connection refused: master3/10.197.0.5:8102\n\tat sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)\n\tat sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)\n\tat org.jboss.netty.channel.socket.nio.NioClientBoss.connect(NioClientBoss.java:152)\n\tat org.jboss.netty.channel.socket.nio.NioClientBoss.processSelectedKeys(NioClientBoss.java:105)\n\tat org.jboss.netty.channel.socket.nio.NioClientBoss.process(NioClientBoss.java:79)\n\tat org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337)\n\tat org.jboss.netty.channel.socket.nio.NioClientBoss.run(NioClientBoss.java:42)\n\tat org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)\n\tat org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)\n\tat java.lang.Thread.run(Thread.java:745)\n","timestamp":"2016-08-13T11:00:00.000Z","beams":"MergingPartitioningBeam(DruidBeam(interval = 2016-08-13T11:00:00.000Z/2016-08-13T12:00:00.000Z, partition = 0, tasks = [index_realtime_pageviews-kafka_2016-08-13T11:00:00.000Z_0_0/pageviews-kafka-011-0000-0000]))","eventCount":1,"exceptionMessage":"java.net.ConnectException: Connection refused: master3/10.197.0.5:8102 from service: druidTask!druid:overlord!index_realtime_pageviews-kafka_2016-08-13T11:00:00.000Z_0_0"}}]

2016-08-13 11:54:27,852 [finagle/netty3-17] WARN c.m.tranquility.beam.ClusteredBeam - Emitting alert: [anomaly] Failed to propagate events: druid:overlord/pageviews-kafka

{
  "eventCount" : 1,
  "timestamp" : "2016-08-13T11:00:00.000Z",
  "beams" : "MergingPartitioningBeam(DruidBeam(interval = 2016-08-13T11:00:00.000Z/2016-08-13T12:00:00.000Z, partition = 0, tasks = [index_realtime_pageviews-kafka_2016-08-13T11:00:00.000Z_0_0/pageviews-kafka-011-0000-0000]))"
}

com.twitter.finagle.ChannelWriteException: java.net.ConnectException: Connection refused: master3/10.197.0.5:8102 from service: druidTask!druid:overlord!index_realtime_pageviews-kafka_2016-08-13T11:00:00.000Z_0_0

Caused by: java.net.ConnectException: Connection refused: master3/10.197.0.5:8102

Hey Chanh,

Do you have any firewalls or port restrictions in place? The Tranquility process needs to be able to talk to the overlord and the peon processes (in IAP, the Master and Data nodes).
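A quick way to check this from the machine running Tranquility is a simple TCP probe. The sketch below uses only the Python standard library; the host and peon port (master3:8102) come from the stack trace above, and 8090 is the usual overlord port, so adjust both for your cluster.

```python
# TCP reachability probe: from the Tranquility host, check that the overlord
# and the peon port named in the ChannelWriteException can be reached.
import socket

def port_open(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused, timeout, and DNS failures
        return False

# 8102 is the peon port from the error; 8090 is the default overlord port.
for host, port in [("master3", 8090), ("master3", 8102)]:
    state = "reachable" if port_open(host, port) else "blocked or closed"
    print(f"{host}:{port} is {state}")
```

If a port shows as blocked while the task is running, look at firewall rules (iptables, security groups) between the Tranquility node and the Data nodes; peons bind to a range of ports above the middleManager's base port, so the whole range needs to be open, not just one port.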