NoSuchMethodError reading from Kafka with lz4 compression

Hi all,

Has anyone had success with realtime nodes reading Kafka data that’s LZ4-compressed? I get a NoSuchMethodError saying the method “net.jpountz.util.Utils.checkRange” cannot be found.

Suggestions appreciated. I’m using Druid 0.7.1.1 with the kafka-eight extension.

2015-05-05T19:09:24,694 ERROR [chief-svc_perf] io.druid.segment.realtime.RealtimeManager - Exception aborted realtime processing[svc_perf]: {class=io.druid.segment.realtime.RealtimeManager, exceptionType=class java.lang.NoSuchMethodError, exceptionMessage=net.jpountz.util.Utils.checkRange([BII)V}
java.lang.NoSuchMethodError: net.jpountz.util.Utils.checkRange([BII)V
    at org.apache.kafka.common.message.KafkaLZ4BlockInputStream.read(KafkaLZ4BlockInputStream.java:176) ~[?:?]
    at java.io.FilterInputStream.read(FilterInputStream.java:107) ~[?:1.7.0_76]
    at kafka.message.ByteBufferMessageSet$$anonfun$decompress$1.apply$mcI$sp(ByteBufferMessageSet.scala:67) ~[?:?]
    at kafka.message.ByteBufferMessageSet$$anonfun$decompress$1.apply(ByteBufferMessageSet.scala:67) ~[?:?]
    at kafka.message.ByteBufferMessageSet$$anonfun$decompress$1.apply(ByteBufferMessageSet.scala:67) ~[?:?]
    at scala.collection.immutable.Stream$.continually(Stream.scala:1129) ~[?:?]
    at kafka.message.ByteBufferMessageSet$.decompress(ByteBufferMessageSet.scala:67) ~[?:?]
    at kafka.message.ByteBufferMessageSet$$anon$1.makeNextOuter(ByteBufferMessageSet.scala:179) ~[?:?]
    at kafka.message.ByteBufferMessageSet$$anon$1.makeNext(ByteBufferMessageSet.scala:192) ~[?:?]
    at kafka.message.ByteBufferMessageSet$$anon$1.makeNext(ByteBufferMessageSet.scala:146) ~[?:?]
    at kafka.utils.IteratorTemplate.maybeComputeNext(IteratorTemplate.scala:66) ~[?:?]
    at kafka.utils.IteratorTemplate.hasNext(IteratorTemplate.scala:58) ~[?:?]
    at kafka.utils.IteratorTemplate.next(IteratorTemplate.scala:38) ~[?:?]
    at kafka.consumer.ConsumerIterator.makeNext(ConsumerIterator.scala:94) ~[?:?]
    at kafka.consumer.ConsumerIterator.makeNext(ConsumerIterator.scala:33) ~[?:?]
    at kafka.utils.IteratorTemplate.maybeComputeNext(IteratorTemplate.scala:66) ~[?:?]
    at kafka.utils.IteratorTemplate.hasNext(IteratorTemplate.scala:58) ~[?:?]
    at io.druid.firehose.kafka.KafkaEightFirehoseFactory$1.hasMore(KafkaEightFirehoseFactory.java:106) ~[?:?]
    at io.druid.segment.realtime.RealtimeManager$FireChief.run(RealtimeManager.java:235) [druid-server-0.7.1.1.jar:0.7.1.1]

Thanks,

Roger

This seems like a net.jpountz.lz4:lz4 version conflict between Kafka and Druid. It looks like that method got moved from Utils to UnsafeUtils in net.jpountz.lz4:lz4 1.3.0 (Kafka is built against 1.2.0, Druid is built against 1.3.0). You might be able to get around this by building a version of Kafka against net.jpountz.lz4:lz4 1.3.0. The Kafka folks should hopefully be open to a patch making that the default.
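If it helps with debugging, here’s a rough sketch of a standalone check (the class name is just illustrative, not anything shipped with Druid or Kafka) that you could run with the same classpath as the realtime node, to see which jar is supplying net.jpountz.util.Utils and whether the exact method the Kafka consumer calls is present:

    import java.lang.reflect.Method;

    public class Lz4VersionCheck {
        public static void main(String[] args) throws Exception {
            Class<?> utils = Class.forName("net.jpountz.util.Utils");
            // Which jar on the classpath actually supplies the class?
            System.out.println("Loaded from: "
                    + utils.getProtectionDomain().getCodeSource().getLocation());
            try {
                // The signature from the stack trace: checkRange([BII)V
                Method m = utils.getMethod("checkRange", byte[].class, int.class, int.class);
                System.out.println("Found " + m + " (lz4 1.2.x-style API)");
            } catch (NoSuchMethodException e) {
                System.out.println("checkRange(byte[], int, int) is missing -- "
                        + "this classpath is resolving to lz4 1.3.0");
            }
        }
    }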

Note that the Kafka LZ4 output and input streams currently use the LZ4 Frame format (spec 1.4.1), but I’m not sure whether that particular output stream is what’s being used in your case.

Druid has no LZ4 frame support as of yet (only LZ4 blocks), so I’m not convinced it will work correctly even once you get the version dependency resolved. It would be very helpful if you could report back whether you see any data corruption due to frames vs. blocks, or whether it is all handled transparently.
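To make the frames-vs-blocks distinction concrete, here is a minimal sketch (assuming lz4-java on the classpath; the class name is just for illustration) of lz4-java’s block streams, which use their own framing rather than the LZ4 Frame format that the Kafka LZ4 streams mentioned above implement:

    import java.io.*;
    import net.jpountz.lz4.LZ4BlockInputStream;
    import net.jpountz.lz4.LZ4BlockOutputStream;

    public class Lz4BlockRoundTrip {
        public static void main(String[] args) throws IOException {
            byte[] payload = "hello, lz4 blocks".getBytes("UTF-8");

            // Compress with lz4-java's block stream (its own header, not LZ4 Frame).
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            try (OutputStream out = new LZ4BlockOutputStream(buf)) {
                out.write(payload);
            }

            // Decompress and verify the round trip.
            byte[] decoded = new byte[payload.length];
            try (InputStream in = new LZ4BlockInputStream(
                    new ByteArrayInputStream(buf.toByteArray()))) {
                new DataInputStream(in).readFully(decoded);
            }
            System.out.println(new String(decoded, "UTF-8"));
        }
    }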

I have a patch in with jpountz (targeting 1.5.0) to add frame support there officially, but I don’t know whether it will be accepted.

If you’re interested in 1.5.0 LZ4 frame support for Kafka, I have a local patch that I have no good way of testing :( I’m mostly concerned about what happens during a rolling upgrade.

Thanks, Gian.

Thanks, Charles.

The kafka-eight extension includes the 0.8.1.1 Kafka consumer, so it should support LZ4 frames as well. The main issue seems to be a conflict between the lz4 library version that ships with Druid itself and the version pulled in by the Kafka consumer.
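For anyone else hitting this, a quick way to confirm that two lz4 jars are competing is a throwaway class like the one below (the name is just for illustration), run with the same classpath as the node:

    import java.net.URL;
    import java.util.Enumeration;

    public class FindLz4Jars {
        public static void main(String[] args) throws Exception {
            // Every jar that bundles this class shows up once in the enumeration,
            // so two lines here means two lz4 versions are on the classpath.
            Enumeration<URL> copies = FindLz4Jars.class.getClassLoader()
                    .getResources("net/jpountz/util/Utils.class");
            while (copies.hasMoreElements()) {
                System.out.println(copies.nextElement());
            }
        }
    }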

Cheers,

Roger

@Roger

Great to hear! Thanks for reporting back.