Issue running kafka indexing service

Hi,

on my first attempt to use the new Kafka indexing service, the task logs show the following exception, and I'm currently out of ideas about what the issue might be:

2016-07-05T15:57:28,562 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.client.cache.CacheProvider] from props[druid.cache.] as [io.druid.client.cache.LocalCacheProvider@77114efe]
2016-07-05T15:57:28,573 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.cache.CacheConfig] from props[druid.realtime.cache.] as [io.druid.client.cache.CacheConfig@326d27ac]
2016-07-05 15:57:28,584 main ERROR An exception occurred processing Appender Console com.google.inject.internal.guava.collect.$ComputationException: java.lang.IllegalArgumentException
at com.google.inject.internal.guava.collect.$ComputingConcurrentHashMap$ComputingMapAdapter.get(ComputingConcurrentHashMap.java:397)
at com.google.inject.internal.util.StackTraceElements.forMember(StackTraceElements.java:56)
at com.google.inject.internal.Errors.formatSource(Errors.java:703)
at com.google.inject.internal.Errors.format(Errors.java:568)
at com.google.inject.ProvisionException.getMessage(ProvisionException.java:61)
at org.apache.logging.log4j.core.impl.ThrowableProxy.<init>(ThrowableProxy.java:131)
at org.apache.logging.log4j.core.impl.ThrowableProxy.<init>(ThrowableProxy.java:117)
at org.apache.logging.log4j.core.impl.Log4jLogEvent.getThrownProxy(Log4jLogEvent.java:482)
at org.apache.logging.log4j.core.pattern.ExtendedThrowablePatternConverter.format(ExtendedThrowablePatternConverter.java:64)
at org.apache.logging.log4j.core.pattern.PatternFormatter.format(PatternFormatter.java:36)
at org.apache.logging.log4j.core.layout.PatternLayout$PatternSerializer.toSerializable(PatternLayout.java:292)
at org.apache.logging.log4j.core.layout.PatternLayout.toSerializable(PatternLayout.java:206)
at org.apache.logging.log4j.core.layout.PatternLayout.toSerializable(PatternLayout.java:56)
at org.apache.logging.log4j.core.layout.AbstractStringLayout.toByteArray(AbstractStringLayout.java:148)
at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.append(AbstractOutputStreamAppender.java:112)
at org.apache.logging.log4j.core.config.AppenderControl.tryCallAppender(AppenderControl.java:152)
at org.apache.logging.log4j.core.config.AppenderControl.callAppender0(AppenderControl.java:125)
at org.apache.logging.log4j.core.config.AppenderControl.callAppenderPreventRecursion(AppenderControl.java:116)
at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:84)
at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:390)
at org.apache.logging.log4j.core.config.LoggerConfig.processLogEvent(LoggerConfig.java:378)
at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:362)
at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:352)
at org.apache.logging.log4j.core.config.AwaitCompletionReliabilityStrategy.log(AwaitCompletionReliabilityStrategy.java:63)
at org.apache.logging.log4j.core.Logger.logMessage(Logger.java:147)
at org.apache.logging.log4j.spi.AbstractLogger.logMessage(AbstractLogger.java:1011)
at org.apache.logging.log4j.spi.AbstractLogger.logIfEnabled(AbstractLogger.java:988)
at org.apache.logging.slf4j.Log4jLogger.error(Log4jLogger.java:318)
at com.metamx.common.logger.Logger.error(Logger.java:97)
at io.druid.cli.GuiceRunnable.initLifecycle(GuiceRunnable.java:94)
at io.druid.cli.CliPeon.run(CliPeon.java:274)
at io.druid.cli.Main.main(Main.java:105)
Caused by: java.lang.IllegalArgumentException
at com.google.inject.internal.asm.$ClassReader.<init>(Unknown Source)
at com.google.inject.internal.asm.$ClassReader.<init>(Unknown Source)
at com.google.inject.internal.asm.$ClassReader.<init>(Unknown Source)
at com.google.inject.internal.util.LineNumbers.<init>(LineNumbers.java:65)
at com.google.inject.internal.util.StackTraceElements$1.apply(StackTraceElements.java:39)
at com.google.inject.internal.util.StackTraceElements$1.apply(StackTraceElements.java:36)
at com.google.inject.internal.guava.collect.$ComputingConcurrentHashMap$ComputingValueReference.compute(ComputingConcurrentHashMap.java:355)
at com.google.inject.internal.guava.collect.$ComputingConcurrentHashMap$ComputingSegment.compute(ComputingConcurrentHashMap.java:184)
at com.google.inject.internal.guava.collect.$ComputingConcurrentHashMap$ComputingSegment.getOrCompute(ComputingConcurrentHashMap.java:153)
at com.google.inject.internal.guava.collect.$ComputingConcurrentHashMap.getOrCompute(ComputingConcurrentHashMap.java:69)
at com.google.inject.internal.guava.collect.$ComputingConcurrentHashMap$ComputingMapAdapter.get(ComputingConcurrentHashMap.java:393)
… 31 more

Has anybody seen an exception like this and knows what might be causing it?

Thanks,
Sascha

Hey Sascha,

It looks like something is failing while Guice is starting up, but the actual failure is being masked by a second Guice failure while generating the appropriate error message, which is why the stack trace looks so obscure. There's a similar issue mentioned here which occurs in the same version of Guice we're using (4.0-beta).

It’s likely that Guice is failing because of a bad configuration-based binding somewhere. If you post your common.runtime.properties and runtime.properties for the middle manager, we can see if there are any hints in there as to what is going on. Alternatively (or additionally), it might be worthwhile to try the Kafka indexing service tutorial (http://imply.io/docs/latest/tutorial-kafka-indexing-service.html), which uses fairly conservative configurations and might help rule out other potential causes.
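For reference, the sort of configuration-based binding that usually matters here is the extension load list in common.runtime.properties; a minimal sketch (the exact directory paths are placeholders for your installation) might look like:

```
# common.runtime.properties (sketch) -- load the Kafka indexing service
# extension; a typo'd extension name or a stale jar in this directory is a
# common source of Guice provisioning failures at startup.
druid.extensions.loadList=["druid-kafka-indexing-service"]
druid.extensions.directory=/opt/druid/extensions
```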

If neither of those options pans out, there’s an open PR to update to Guice 4.1, which will hopefully fix the Guice error so the real exception is visible: https://github.com/druid-io/druid/pull/3222. A last resort might be generating a build from master with Guice 4.1 and seeing whether that clarifies what’s going on.

Hi,

The Guice update to 4.1.0 helped reveal the original exception:

Caused by: java.lang.NoSuchMethodError: net.jpountz.util.Utils.checkRange([BII)V
	at org.apache.kafka.common.record.KafkaLZ4BlockInputStream.read(KafkaLZ4BlockInputStream.java:177)

Currently, druid-kafka-indexing-service depends on kafka-clients 0.9.0.1, which was built against lz4 1.2.0, but Druid itself uses lz4 1.3.0. The Maven build puts both kafka-clients-0.9.0.1.jar and lz4-1.3.0.jar in the druid-kafka-indexing-service extension folder, so at runtime the Kafka client finds an lz4 without the method it was compiled against. It’s strange that no one has hit this issue so far.
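To confirm which lz4 wins on a given classpath, a small reflection probe can check for the exact method named in the NoSuchMethodError above. This is only an illustrative sketch (the class name CheckLz4 is made up); the class and method signature come straight from the stack trace, and you'd run it with the extension's jars on the classpath:

```java
// Sketch: probe for net.jpountz.util.Utils.checkRange([BII)V, the method
// the Kafka 0.9 LZ4 stream expects. With lz4 1.3.0 on the classpath this
// reports the method as missing; with 1.2.0 it reports it as present.
public class CheckLz4 {
    /** Returns a human-readable status for the checkRange lookup. */
    static String probe() {
        try {
            Class<?> utils = Class.forName("net.jpountz.util.Utils");
            utils.getMethod("checkRange", byte[].class, int.class, int.class);
            return "checkRange(byte[], int, int) present";
        } catch (ClassNotFoundException e) {
            return "lz4 not on classpath";
        } catch (NoSuchMethodException e) {
            return "checkRange missing: lz4 newer than what kafka-clients expects";
        }
    }

    public static void main(String[] args) {
        System.out.println(probe());
    }
}
```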

I’ll try updating kafka-clients to 0.10.0.0 and report back with the results.
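Assuming a Maven build of the extension, bumping the client is just a version change on the existing dependency (coordinates below are the standard Apache Kafka ones; the surrounding pom structure is Druid's, not shown here):

```xml
<!-- druid-kafka-indexing-service pom.xml (sketch): kafka-clients 0.10.0.0
     was built against lz4 1.3.0, matching the lz4 version Druid ships. -->
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka-clients</artifactId>
  <version>0.10.0.0</version>
</dependency>
```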

Best regards,

Roman

I filed an issue ticket for fixing this: #3266

As a temporary workaround, I use Snappy instead of LZ4.
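Since compression is chosen on the producer side, this workaround is a one-line change in the Kafka producer configuration (no Druid changes needed); the consumer then never exercises the broken LZ4 decompression path:

```
# Kafka producer config (sketch): publish snappy-compressed batches so the
# Druid indexing tasks decompress with Snappy instead of LZ4.
compression.type=snappy
```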