Warnings with tranquility client

I have a component that reads data from RabbitMQ and pushes it to Druid via the Tranquility client. Whenever the “beamPacketizer.send” method is executed, I get warnings in the log about a transient error, but the events are eventually indexed after several retries. Here is the stack trace:

2015-07-31 04:08:20.157 WARN 53585 — [inagle/netty3-1] c.m.tranquility.finagle.FutureRetry$ : Transient error, will try again in 6332 ms
java.io.IOException: Unable to push events to task: index_realtime_weatherbot_2015-07-31T04:07:00.000-04:00_0_0 (status = TaskRunning)
at com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$3$$anonfun$applyOrElse$2.apply(DruidBeam.scala:160)
at com.metamx.tranquility.druid.DruidBeam$$anonfun$4$$anonfun$apply$4$$anonfun$apply$6$$anonfun$apply$7$$anonfun$apply$3$$anonfun$applyOrElse$2.apply(DruidBeam.scala:146)
at com.twitter.util.Future$$anonfun$map$1$$anonfun$apply$6.apply(Future.scala:863)
at com.twitter.util.Try$.apply(Try.scala:13)
at com.twitter.util.Future$.apply(Future.scala:90)
at com.twitter.util.Future$$anonfun$map$1.apply(Future.scala:863)
at com.twitter.util.Future$$anonfun$map$1.apply(Future.scala:863)
at com.twitter.util.Future$$anonfun$flatMap$1.apply(Future.scala:824)
at com.twitter.util.Future$$anonfun$flatMap$1.apply(Future.scala:823)
at com.twitter.util.Promise$Transformer.liftedTree1$1(Promise.scala:100)
at com.twitter.util.Promise$Transformer.k(Promise.scala:100)
at com.twitter.util.Promise$Transformer.apply(Promise.scala:110)
at com.twitter.util.Promise$Transformer.apply(Promise.scala:91)
at com.twitter.util.Promise$$anon$2.run(Promise.scala:345)
at com.twitter.concurrent.LocalScheduler$Activation.run(Scheduler.scala:186)
at com.twitter.concurrent.LocalScheduler$Activation.submit(Scheduler.scala:157)
at com.twitter.concurrent.LocalScheduler.submit(Scheduler.scala:212)
at com.twitter.concurrent.Scheduler$.submit(Scheduler.scala:86)
at com.twitter.util.Promise.runq(Promise.scala:331)
at com.twitter.util.Promise.updateIfEmpty(Promise.scala:642)
at com.twitter.util.Promise.update(Promise.scala:615)
at com.twitter.util.Promise.setValue(Promise.scala:591)
at com.twitter.concurrent.AsyncQueue.offer(AsyncQueue.scala:76)
at com.twitter.finagle.transport.ChannelTransport.handleUpstream(ChannelTransport.scala:45)
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
at org.jboss.netty.handler.codec.http.HttpContentDecoder.messageReceived(HttpContentDecoder.java:108)
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
at org.jboss.netty.handler.codec.http.HttpChunkAggregator.messageReceived(HttpChunkAggregator.java:194)
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:459)
at org.jboss.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:536)
at org.jboss.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:435)
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
at org.jboss.netty.handler.codec.http.HttpClientCodec.handleUpstream(HttpClientCodec.java:92)
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
at org.jboss.netty.channel.SimpleChannelHandler.messageReceived(SimpleChannelHandler.java:142)
at com.twitter.finagle.channel.ChannelStatsHandler.messageReceived(ChannelStatsHandler.scala:86)
at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:88)
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
at org.jboss.netty.channel.SimpleChannelHandler.messageReceived(SimpleChannelHandler.java:142)
at com.twitter.finagle.channel.ChannelRequestStatsHandler.messageReceived(ChannelRequestStatsHandler.scala:35)
at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:88)
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337)
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.twitter.finagle.NoBrokersAvailableException: No hosts are available for druid:firehose:weatherbot-07-0000-0000
at com.twitter.finagle.NoStacktrace(Unknown Source)


Any idea what could be causing these warnings?

Thanks,

Ravish

Hey Ravish,

That means Tranquility can’t find the task “index_realtime_weatherbot_2015-07-31T04:07:00.000-04:00_0_0”. It’s looking for it in service discovery under the key “druid:firehose:weatherbot-07-0000-0000”. Is that task actually running? You can tell in the overlord web console: if it’s assigned to a worker and has a non-empty log, then it’s actually running. If it is running, is it always available, or is it periodically unavailable due to something like long GC pauses?
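As an aside, the discovery key in that warning looks mechanically derived from the datasource name plus a time-based field and zero-padded partition/replicant numbers. Here is a rough sketch of building the key to look up, inferred purely from the names visible in this thread’s logs; the field layout is an assumption, not Tranquility’s documented format:

```java
public class FirehoseKey {
    // Inferred from the log lines above: "druid:firehose:weatherbot-07-0000-0000".
    // `timePart` appears to come from the segment start time ("07" here); treat
    // this layout as an assumption, not a documented Tranquility contract.
    static String discoveryKey(String dataSource, String timePart, int partition, int replicant) {
        return String.format("druid:firehose:%s-%s-%04d-%04d",
                dataSource, timePart, partition, replicant);
    }

    public static void main(String[] args) {
        System.out.println(discoveryKey("weatherbot", "07", 0, 0));
    }
}
```

Knowing the exact key makes it easy to check whether the task has actually announced itself under the service-discovery path in ZooKeeper.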

Hi Gian,
I am not creating any realtime index tasks beforehand; the task “index_realtime_weatherbot_2015-07-31T04:07:00.000-04:00_0_0” is created (and is in a running state) as soon as the event is published from Tranquility.

Once a task is created, I don’t see this warning on subsequent event pushes.

Thanks,

Ravish

OK, so do I have this right? What happens is:

  1. You send some data with tranquility

  2. It emits errors for a while about not finding your realtime task

  3. Then the errors go away and tranquility works fine

  4. Possibly at the next segmentGranularity boundary, this happens again

This could be caused by your tasks taking longer than normal to start up. How long does it usually take for the errors to resolve? Can you tell whether your indexing service is at full capacity during this time?
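For what it’s worth, the “will try again in 6332 ms” warnings are Tranquility’s transient-error retry loop doing its job while the task starts up. A generic sketch of that pattern (randomized, growing backoff) is below; this is an illustration of the idea, not Tranquility’s actual FutureRetry implementation:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ThreadLocalRandom;

public class TransientRetry {
    // Retry `task` up to maxAttempts, sleeping a randomized, growing delay
    // between attempts; rethrows the last failure if every attempt fails.
    static <T> T withRetries(Callable<T> task, int maxAttempts, long baseDelayMs) throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return task.call();
            } catch (Exception e) {
                last = e;
                long delay = baseDelayMs * (1L << (attempt - 1));         // exponential growth
                delay += ThreadLocalRandom.current().nextLong(delay / 2 + 1); // jitter
                System.out.printf("Transient error, will try again in %d ms%n", delay);
                Thread.sleep(delay);
            }
        }
        throw last;
    }
}
```

If every attempt fails, the caller eventually sees the underlying cause, which in this thread is the NoBrokersAvailableException from service discovery.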

This is exactly what is happening. I am running all of this in a local setup. The task appears immediately under the “Running Tasks” section of the indexer console, but it takes 5 seconds for the log to print this final message:

2015-07-31T17:56:01,979 INFO [task-runner-0] io.druid.curator.discovery.CuratorServiceAnnouncer - Announcing service[DruidNode{serviceName='weatherbot-17-0000-0000', host='192.168.0.102', port=8100}]
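If that ~5-second announcement window is the only gap, Tranquility’s retries will paper over it, but another option is to wait for the task to become discoverable before sending the first event at a new segment boundary. A minimal poll-until-ready helper is sketched below; the readiness check itself (for example, querying the overlord’s task APIs or checking the discovery path in ZooKeeper) is left to you and is hypothetical here:

```java
import java.util.function.BooleanSupplier;

public class Await {
    // Poll `condition` every pollMs until it returns true or timeoutMs elapses.
    // Returns the final value of the condition.
    static boolean until(BooleanSupplier condition, long timeoutMs, long pollMs) throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            if (condition.getAsBoolean()) return true;
            Thread.sleep(pollMs);
        }
        return condition.getAsBoolean();
    }
}
```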