Kafka Indexing Service Failing with java.lang.IllegalArgumentException: fromIndex(0) > toIndex(-1)

Hello,

I've been testing Druid 0.10.1 with the Kafka indexing service; I wanted to run it alongside our production setup to see if there are any discrepancies in the data.

I submitted this supervisor spec:

```
curl -X POST -H 'Content-Type: application/json' -d '{
  "type": "kafka",
  "dataSchema": {
    "dataSource": "anonyme",
    "parser": {
      "type": "string",
      "parseSpec": {
        "format": "json",
        "timestampSpec": {
          "column": "ts",
          "format": "auto"
        },
        "dimensionsSpec": {
          "dimensions": [],
          "dimensionExclusions": [],
          "spatialDimensions": []
        }
      }
    },
    "metricsSpec": [
      {
        "type": "javascript",
        "name": "count",
        "fieldNames": [],
        "fnAggregate": "function(current, a, b) { return 1; }",
        "fnCombine": "function(partialA, partialB) { return 1; }",
        "fnReset": "function() { return 1; }"
      },
      {
        "type": "count",
        "name": "duplicated_count"
      },
      {
        "type": "doubleMax",
        "name": "winPrice",
        "fieldName": "winPrice"
      },
      {
        "type": "doubleMax",
        "name": "usdPrice",
        "fieldName": "usdPrice"
      },
      {
        "type": "doubleMax",
        "name": "winPriceCV",
        "fieldName": "winPriceCV"
      },
      {
        "type": "doubleMax",
        "name": "usdPriceCV",
        "fieldName": "usdPriceCV"
      }
    ],
    "granularitySpec": {
      "type": "uniform",
      "segmentGranularity": "FIVE_MINUTE",
      "queryGranularity": "FIVE_MINUTE"
    }
  },
  "tuningConfig": {
    "type": "kafka",
    "maxRowsPerSegment": 5000000
  },
  "ioConfig": {
    "topic": "events",
    "consumerProperties": {
      "bootstrap.servers": "10.89.16.37:9092,10.89.16.38:9092,10.89.16.39:9092"
    },
    "taskCount": 1,
    "replicas": 1,
    "taskDuration": "PT10M"
  }
}' http://10.89.16.35:8090/druid/indexer/v1/supervisor
```

and the resulting Kafka indexing task fails with this exception:

```
java.lang.IllegalArgumentException: fromIndex(0) > toIndex(-1)
	at java.util.ArrayList.subListRangeCheck(ArrayList.java:1006) ~[?:1.8.0_141]
	at java.util.ArrayList.subList(ArrayList.java:996) ~[?:1.8.0_141]
	at io.druid.segment.realtime.appenderator.AppenderatorImpl.persist(AppenderatorImpl.java:361) ~[druid-server-0.10.1.jar:0.10.1]
	at io.druid.segment.realtime.appenderator.AppenderatorImpl.persistAll(AppenderatorImpl.java:442) ~[druid-server-0.10.1.jar:0.10.1]
	at io.druid.segment.realtime.appenderator.AppenderatorDriver.persist(AppenderatorDriver.java:247) ~[druid-server-0.10.1.jar:0.10.1]
	at io.druid.indexing.kafka.KafkaIndexTask.run(KafkaIndexTask.java:498) ~[?:?]
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436) [druid-indexing-service-0.10.1.jar:0.10.1]
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408) [druid-indexing-service-0.10.1.jar:0.10.1]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_141]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_141]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_141]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_141]
```
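If it helps decode the message: the exception itself is just the standard `ArrayList.subList` range check firing, so `AppenderatorImpl.persist` apparently ends up requesting a sub-list that ends at index -1 of whatever it is about to persist. A minimal, Druid-free sketch that reproduces the identical message (the list contents here are purely illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class SubListRepro {
    public static void main(String[] args) {
        List<String> rows = new ArrayList<>();
        rows.add("some row");

        // fromIndex (0) is greater than toIndex (-1), so ArrayList.subListRangeCheck
        // throws: java.lang.IllegalArgumentException: fromIndex(0) > toIndex(-1)
        rows.subList(0, -1);
    }
}
```

So presumably the real question is why persist computes a toIndex of -1 for a task created from the spec above.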

Could anyone help?