Hello Druid Users
I am trying to merge two adjacent segments of a Druid datasource, and the merge task fails with the following error in the overlord logs.
Druid version: 0.10.0
(I get the same error on Druid 0.9.2.)
Exception from the Overlord console:
io.druid.java.util.common.ISE: Merge is invalid: current segment(s) are not in the requested set: foo1061434376_2017-07-08T00:00:00.000Z_2017-07-09T00:00:00.000Z_2017-07-10T09:30:42.103Z, foo1061434376_2017-07-07T00:00:00.000Z_2017-07-08T00:00:00.000Z_2017-07-10T09:30:59.082Z
at io.druid.indexing.common.task.MergeTaskBase.isReady(MergeTaskBase.java:219) ~[druid-indexing-service-0.10.0-iap5.jar:0.10.0-iap5]
at io.druid.indexing.overlord.TaskQueue.manage(TaskQueue.java:246) [druid-indexing-service-0.10.0-iap5.jar:0.10.0-iap5]
at io.druid.indexing.overlord.TaskQueue.access$000(TaskQueue.java:69) [druid-indexing-service-0.10.0-iap5.jar:0.10.0-iap5]
at io.druid.indexing.overlord.TaskQueue$1.run(TaskQueue.java:136) [druid-indexing-service-0.10.0-iap5.jar:0.10.0-iap5]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_131]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_131]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_131]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_131]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_131]
Merge task JSON spec that was posted to the overlord:
{
  "type": "merge",
  "dataSource": "foo1061434376",
  "segments": [
    {
      "dataSource": "foo1061434376",
      "interval": "2017-07-07T00:00:00.000Z/2017-07-08T00:00:00.000Z",
      "intervalSpec": {
        "begin": "2017-07-07T00:00:00Z",
        "end": "2017-07-08T00:00:00Z"
      },
      "dimensions": "dim1,dim2,dim3,index",
      "metrics": "x,date,RowsCount",
      "size": 8597,
      "identifier": "foo1061434376_2017-07-07T00:00:00.000Z_2017-07-08T00:00:00.000Z_2017-07-10T09:30:59.082Z",
      "loadSpec": {
        "type": "local",
        "path": "/druid-hdd/segmentsNFS/foo1061434376/2017-07-07T00:00:00.000Z_2017-07-08T00:00:00.000Z/2017-07-10T09:30:59.082Z/0/index.zip"
      },
      "binaryVersion": "9"
    },
    {
      "dataSource": "foo1061434376",
      "interval": "2017-07-08T00:00:00.000Z/2017-07-09T00:00:00.000Z",
      "intervalSpec": {
        "begin": "2017-07-08T00:00:00Z",
        "end": "2017-07-09T00:00:00Z"
      },
      "dimensions": "dim1,dim2,dim3,index",
      "metrics": "x,date,RowsCount",
      "size": 8596,
      "identifier": "foo1061434376_2017-07-08T00:00:00.000Z_2017-07-09T00:00:00.000Z_2017-07-10T09:30:42.103Z",
      "loadSpec": {
        "type": "local",
        "path": "/druid-hdd/segmentsNFS/foo1061434376/2017-07-08T00:00:00.000Z_2017-07-09T00:00:00.000Z/2017-07-10T09:30:42.103Z/0/index.zip"
      },
      "binaryVersion": "9"
    }
  ],
  "aggregations": [
    {
      "type": "doubleSum",
      "name": "x",
      "fieldName": "x"
    },
    {
      "type": "longSum",
      "name": "date",
      "fieldName": "date"
    },
    {
      "type": "longSum",
      "name": "RowsCount",
      "fieldName": "RowsCount"
    }
  ],
  "rollup": true
}
All segment JSON specs were retrieved from the endpoint http:///druid/coordinator/v1/datasources/?full
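To illustrate the mismatch the exception is complaining about: as I understand it, the overlord only accepts the merge if the identifiers in the spec exactly match the segments currently served for those intervals. A minimal sketch of that comparison (the "current" list with a newer 2017-07-11 version is purely hypothetical, just to show how a version bump between retrieval and submission would surface; in practice the current list would come from the coordinator endpoint above):

```python
# Sketch of the requested-vs-current identifier check behind the
# "Merge is invalid: current segment(s) are not in the requested set" error.

def diff_merge_spec(requested, current):
    """Return (missing, stale): identifiers in the spec but not currently
    served, and currently served identifiers the spec omitted."""
    requested, current = set(requested), set(current)
    return requested - current, current - requested

# Identifiers taken from the merge spec above.
requested = [
    "foo1061434376_2017-07-07T00:00:00.000Z_2017-07-08T00:00:00.000Z_2017-07-10T09:30:59.082Z",
    "foo1061434376_2017-07-08T00:00:00.000Z_2017-07-09T00:00:00.000Z_2017-07-10T09:30:42.103Z",
]
# Hypothetical current state: the second interval now has a newer version,
# e.g. after a reindex ran between spec retrieval and task submission.
current = [
    "foo1061434376_2017-07-07T00:00:00.000Z_2017-07-08T00:00:00.000Z_2017-07-10T09:30:59.082Z",
    "foo1061434376_2017-07-08T00:00:00.000Z_2017-07-09T00:00:00.000Z_2017-07-11T08:00:00.000Z",
]
missing, stale = diff_merge_spec(requested, current)
print("in spec but not served:", missing)
print("served but not in spec:", stale)
```

If both sets come back empty the spec matches the cluster state; in my case I would expect one of them to be non-empty, which is what I am trying to track down.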
Does anyone know what causes this exception?
Thanks in advance,
Denis Vlah