Tutorial wikipedia batch loading failed

I set up Druid using the Dockerfile and docker-compose file in the repo directory incubator-druid/distribution/docker, and was able to build the image and run it with docker-compose up. With the Druid services running locally on my single-node Ubuntu machine, I tried the wikipedia batch-load tutorial from the Druid documentation, but the task fails immediately every time I run it.

I am running the following call:

curl -X 'POST' -H 'Content-Type:application/json' -d @quickstart/tutorial/wikipedia-index.json http://localhost:8090/druid/indexer/v1/task

I changed the docker-compose service port mappings to match the container ports, so the overlord on my localhost is listening on 8090.
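As a side note on checking results: the overlord's indexer API also exposes per-task status and log endpoints. This is a minimal sketch, assuming the stock API paths and the remapped port 8090:

```shell
#!/bin/sh
# Overlord base URL (port 8090 after the docker-compose port remap).
OVERLORD="http://localhost:8090"

# Helpers that build the overlord indexer-API URLs for a given task id.
task_status_url() { printf '%s/druid/indexer/v1/task/%s/status' "$OVERLORD" "$1"; }
task_log_url()    { printf '%s/druid/indexer/v1/task/%s/log'    "$OVERLORD" "$1"; }

# Against the live cluster you would then run, e.g.:
#   curl -s "$(task_status_url index_wikipedia_2019-04-02T02:43:17.912Z)"
#   curl -s "$(task_log_url    index_wikipedia_2019-04-02T02:43:17.912Z)"
```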

Here is the output from the overlord GUI console:

index_wikipedia_2019-04-02T02:43:17.912Z
index
2019-04-02T02:43:17.923Z
1970-01-01T00:00:00.000Z
FAILED
FAILED
NONE
3356
null
-1
-1
wikipedia
null

And here is the output of docker logs overlord | grep FAIL:

2019-04-02T05:34:07,970 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.RemoteTaskRunner - Worker[172.20.0.6:8091] wrote FAILED status for task [index_wikipedia_2019-04-02T05:34:04.235Z] on [TaskLocation{host=‘172.20.0.6’, port=8100, tlsPort=-1}]
2019-04-02T05:34:07,970 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.RemoteTaskRunner - Worker[172.20.0.6:8091] completed task[index_wikipedia_2019-04-02T05:34:04.235Z] with status[FAILED]
2019-04-02T05:34:07,971 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.TaskQueue - Received FAILED status for task: index_wikipedia_2019-04-02T05:34:04.235Z
2019-04-02T05:34:07,995 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.MetadataTaskStorage - Updating task index_wikipedia_2019-04-02T05:34:04.235Z to status: TaskStatus{id=index_wikipedia_2019-04-02T05:34:04.235Z, status=FAILED, duration=3474, errorMsg=null}
2019-04-02T05:34:07,998 ERROR [Curator-PathChildrenCache-1] org.apache.druid.java.util.emitter.core.LoggingEmitter - Event [{“feed”:“metrics”,“timestamp”:“2019-04-02T05:34:07.998Z”,“service”:“druid/overlord”,“host”:“172.20.0.4:8090”,“version”:“0.15.0-incubating-SNAPSHOT”,“metric”:“task/run/time”,“value”:3474,“dataSource”:“wikipedia”,“taskId”:“index_wikipedia_2019-04-02T05:34:04.235Z”,“taskStatus”:“FAILED”,“taskType”:“index”}]
2019-04-02T05:34:07,998 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.TaskQueue - Task FAILED: AbstractTask{id=‘index_wikipedia_2019-04-02T05:34:04.235Z’, groupId=‘index_wikipedia_2019-04-02T05:34:04.235Z’, taskResource=TaskResource{availabilityGroup=‘index_wikipedia_2019-04-02T05:34:04.235Z’, requiredCapacity=1}, dataSource=‘wikipedia’, context={}} (3474 run duration)
2019-04-02T05:34:08,000 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.TaskRunnerUtils - Task [index_wikipedia_2019-04-02T05:34:04.235Z] status changed to [FAILED].
2019-04-02T05:40:11,573 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.RemoteTaskRunner - Worker[172.20.0.6:8091] wrote FAILED status for task [index_wikipedia_2019-04-02T05:40:08.259Z] on [TaskLocation{host=‘172.20.0.6’, port=8100, tlsPort=-1}]
2019-04-02T05:40:11,573 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.RemoteTaskRunner - Worker[172.20.0.6:8091] completed task[index_wikipedia_2019-04-02T05:40:08.259Z] with status[FAILED]
2019-04-02T05:40:11,573 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.TaskQueue - Received FAILED status for task: index_wikipedia_2019-04-02T05:40:08.259Z
2019-04-02T05:40:11,584 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.MetadataTaskStorage - Updating task index_wikipedia_2019-04-02T05:40:08.259Z to status: TaskStatus{id=index_wikipedia_2019-04-02T05:40:08.259Z, status=FAILED, duration=3285, errorMsg=null}
2019-04-02T05:40:11,588 ERROR [Curator-PathChildrenCache-1] org.apache.druid.java.util.emitter.core.LoggingEmitter - Event [{“feed”:“metrics”,“timestamp”:“2019-04-02T05:40:11.588Z”,“service”:“druid/overlord”,“host”:“172.20.0.4:8090”,“version”:“0.15.0-incubating-SNAPSHOT”,“metric”:“task/run/time”,“value”:3285,“dataSource”:“wikipedia”,“taskId”:“index_wikipedia_2019-04-02T05:40:08.259Z”,“taskStatus”:“FAILED”,“taskType”:“index”}]
2019-04-02T05:40:11,588 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.TaskQueue - Task FAILED: AbstractTask{id=‘index_wikipedia_2019-04-02T05:40:08.259Z’, groupId=‘index_wikipedia_2019-04-02T05:40:08.259Z’, taskResource=TaskResource{availabilityGroup=‘index_wikipedia_2019-04-02T05:40:08.259Z’, requiredCapacity=1}, dataSource=‘wikipedia’, context={}} (3285 run duration)
2019-04-02T05:40:11,588 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.TaskRunnerUtils - Task [index_wikipedia_2019-04-02T05:40:08.259Z] status changed to [FAILED].
2019-04-02T05:50:40,948 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.RemoteTaskRunner - Worker[172.20.0.6:8091] wrote FAILED status for task [index_wikipedia_2019-04-02T05:50:37.645Z] on [TaskLocation{host=‘172.20.0.6’, port=8100, tlsPort=-1}]
2019-04-02T05:50:40,948 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.RemoteTaskRunner - Worker[172.20.0.6:8091] completed task[index_wikipedia_2019-04-02T05:50:37.645Z] with status[FAILED]
2019-04-02T05:50:40,949 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.TaskQueue - Received FAILED status for task: index_wikipedia_2019-04-02T05:50:37.645Z
2019-04-02T05:50:40,959 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.MetadataTaskStorage - Updating task index_wikipedia_2019-04-02T05:50:37.645Z to status: TaskStatus{id=index_wikipedia_2019-04-02T05:50:37.645Z, status=FAILED, duration=3277, errorMsg=null}
2019-04-02T05:50:40,961 ERROR [Curator-PathChildrenCache-1] org.apache.druid.java.util.emitter.core.LoggingEmitter - Event [{“feed”:“metrics”,“timestamp”:“2019-04-02T05:50:40.961Z”,“service”:“druid/overlord”,“host”:“172.20.0.4:8090”,“version”:“0.15.0-incubating-SNAPSHOT”,“metric”:“task/run/time”,“value”:3277,“dataSource”:“wikipedia”,“taskId”:“index_wikipedia_2019-04-02T05:50:37.645Z”,“taskStatus”:“FAILED”,“taskType”:“index”}]
2019-04-02T05:50:40,961 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.TaskQueue - Task FAILED: AbstractTask{id=‘index_wikipedia_2019-04-02T05:50:37.645Z’, groupId=‘index_wikipedia_2019-04-02T05:50:37.645Z’, taskResource=TaskResource{availabilityGroup=‘index_wikipedia_2019-04-02T05:50:37.645Z’, requiredCapacity=1}, dataSource=‘wikipedia’, context={}} (3277 run duration)
2019-04-02T05:50:40,961 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.TaskRunnerUtils - Task [index_wikipedia_2019-04-02T05:50:37.645Z] status changed to [FAILED].
2019-04-02T06:01:05,315 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.RemoteTaskRunner - Worker[172.20.0.6:8091] wrote FAILED status for task [index_wikipedia_2019-04-02T06:01:01.993Z] on [TaskLocation{host=‘172.20.0.6’, port=8100, tlsPort=-1}]
2019-04-02T06:01:05,315 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.RemoteTaskRunner - Worker[172.20.0.6:8091] completed task[index_wikipedia_2019-04-02T06:01:01.993Z] with status[FAILED]
2019-04-02T06:01:05,316 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.TaskQueue - Received FAILED status for task: index_wikipedia_2019-04-02T06:01:01.993Z
2019-04-02T06:01:05,327 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.MetadataTaskStorage - Updating task index_wikipedia_2019-04-02T06:01:01.993Z to status: TaskStatus{id=index_wikipedia_2019-04-02T06:01:01.993Z, status=FAILED, duration=3291, errorMsg=null}
2019-04-02T06:01:05,329 ERROR [Curator-PathChildrenCache-1] org.apache.druid.java.util.emitter.core.LoggingEmitter - Event [{“feed”:“metrics”,“timestamp”:“2019-04-02T06:01:05.329Z”,“service”:“druid/overlord”,“host”:“172.20.0.4:8090”,“version”:“0.15.0-incubating-SNAPSHOT”,“metric”:“task/run/time”,“value”:3291,“dataSource”:“wikipedia”,“taskId”:“index_wikipedia_2019-04-02T06:01:01.993Z”,“taskStatus”:“FAILED”,“taskType”:“index”}]
2019-04-02T06:01:05,329 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.TaskQueue - Task FAILED: AbstractTask{id=‘index_wikipedia_2019-04-02T06:01:01.993Z’, groupId=‘index_wikipedia_2019-04-02T06:01:01.993Z’, taskResource=TaskResource{availabilityGroup=‘index_wikipedia_2019-04-02T06:01:01.993Z’, requiredCapacity=1}, dataSource=‘wikipedia’, context={}} (3291 run duration)
2019-04-02T06:01:05,329 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.TaskRunnerUtils - Task [index_wikipedia_2019-04-02T06:01:01.993Z] status changed to [FAILED].
2019-04-02T06:18:10,552 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.RemoteTaskRunner - Worker[172.20.0.6:8091] wrote FAILED status for task [index_wikipedia_2019-04-02T06:18:07.137Z] on [TaskLocation{host=‘172.20.0.6’, port=8100, tlsPort=-1}]
2019-04-02T06:18:10,552 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.RemoteTaskRunner - Worker[172.20.0.6:8091] completed task[index_wikipedia_2019-04-02T06:18:07.137Z] with status[FAILED]
2019-04-02T06:18:10,552 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.TaskQueue - Received FAILED status for task: index_wikipedia_2019-04-02T06:18:07.137Z
2019-04-02T06:18:10,563 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.MetadataTaskStorage - Updating task index_wikipedia_2019-04-02T06:18:07.137Z to status: TaskStatus{id=index_wikipedia_2019-04-02T06:18:07.137Z, status=FAILED, duration=3387, errorMsg=null}
2019-04-02T06:18:10,565 ERROR [Curator-PathChildrenCache-1] org.apache.druid.java.util.emitter.core.LoggingEmitter - Event [{“feed”:“metrics”,“timestamp”:“2019-04-02T06:18:10.565Z”,“service”:“druid/overlord”,“host”:“172.20.0.4:8090”,“version”:“0.15.0-incubating-SNAPSHOT”,“metric”:“task/run/time”,“value”:3387,“dataSource”:“wikipedia”,“taskId”:“index_wikipedia_2019-04-02T06:18:07.137Z”,“taskStatus”:“FAILED”,“taskType”:“index”}]
2019-04-02T06:18:10,565 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.TaskQueue - Task FAILED: AbstractTask{id=‘index_wikipedia_2019-04-02T06:18:07.137Z’, groupId=‘index_wikipedia_2019-04-02T06:18:07.137Z’, taskResource=TaskResource{availabilityGroup=‘index_wikipedia_2019-04-02T06:18:07.137Z’, requiredCapacity=1}, dataSource=‘wikipedia’, context={}} (3387 run duration)
2019-04-02T06:18:10,565 INFO [Curator-PathChildrenCache-1] org.apache.druid.indexing.overlord.TaskRunnerUtils - Task [index_wikipedia_2019-04-02T06:18:07.137Z] status changed to [FAILED].

And here is the output of docker logs middlemanager | grep FAIL:

2019-04-02T05:34:07,958 INFO [forking-task-runner-0] org.apache.druid.indexing.overlord.TaskRunnerUtils - Task [index_wikipedia_2019-04-02T05:34:04.235Z] status changed to [FAILED].
2019-04-02T05:34:07,968 INFO [WorkerTaskManager-NoticeHandler] org.apache.druid.indexing.worker.WorkerTaskManager - Job’s finished. Completed [index_wikipedia_2019-04-02T05:34:04.235Z] with status [FAILED]
2019-04-02T05:34:55,113 INFO [WorkerTaskManager-CompletedTasksCleaner] org.apache.druid.indexing.worker.WorkerTaskManager - Deleting completed task[index_wikipedia_2019-04-02T05:34:04.235Z] information, overlord task status[FAILED].
2019-04-02T05:40:11,567 INFO [forking-task-runner-1] org.apache.druid.indexing.overlord.TaskRunnerUtils - Task [index_wikipedia_2019-04-02T05:40:08.259Z] status changed to [FAILED].
2019-04-02T05:40:11,571 INFO [WorkerTaskManager-NoticeHandler] org.apache.druid.indexing.worker.WorkerTaskManager - Job’s finished. Completed [index_wikipedia_2019-04-02T05:40:08.259Z] with status [FAILED]
2019-04-02T05:44:54,999 INFO [WorkerTaskManager-CompletedTasksCleaner] org.apache.druid.indexing.worker.WorkerTaskManager - Deleting completed task[index_wikipedia_2019-04-02T05:40:08.259Z] information, overlord task status[FAILED].
2019-04-02T05:50:40,942 INFO [forking-task-runner-2] org.apache.druid.indexing.overlord.TaskRunnerUtils - Task [index_wikipedia_2019-04-02T05:50:37.645Z] status changed to [FAILED].
2019-04-02T05:50:40,947 INFO [WorkerTaskManager-NoticeHandler] org.apache.druid.indexing.worker.WorkerTaskManager - Job’s finished. Completed [index_wikipedia_2019-04-02T05:50:37.645Z] with status [FAILED]
2019-04-02T05:54:54,999 INFO [WorkerTaskManager-CompletedTasksCleaner] org.apache.druid.indexing.worker.WorkerTaskManager - Deleting completed task[index_wikipedia_2019-04-02T05:50:37.645Z] information, overlord task status[FAILED].
2019-04-02T06:01:05,306 INFO [forking-task-runner-0] org.apache.druid.indexing.overlord.TaskRunnerUtils - Task [index_wikipedia_2019-04-02T06:01:01.993Z] status changed to [FAILED].
2019-04-02T06:01:05,313 INFO [WorkerTaskManager-NoticeHandler] org.apache.druid.indexing.worker.WorkerTaskManager - Job’s finished. Completed [index_wikipedia_2019-04-02T06:01:01.993Z] with status [FAILED]
2019-04-02T06:04:54,997 INFO [WorkerTaskManager-CompletedTasksCleaner] org.apache.druid.indexing.worker.WorkerTaskManager - Deleting completed task[index_wikipedia_2019-04-02T06:01:01.993Z] information, overlord task status[FAILED].
2019-04-02T06:18:10,546 INFO [forking-task-runner-1] org.apache.druid.indexing.overlord.TaskRunnerUtils - Task [index_wikipedia_2019-04-02T06:18:07.137Z] status changed to [FAILED].
2019-04-02T06:18:10,550 INFO [WorkerTaskManager-NoticeHandler] org.apache.druid.indexing.worker.WorkerTaskManager - Job’s finished. Completed [index_wikipedia_2019-04-02T06:18:07.137Z] with status [FAILED]
2019-04-02T06:19:54,996 INFO [WorkerTaskManager-CompletedTasksCleaner] org.apache.druid.indexing.worker.WorkerTaskManager - Deleting completed task[index_wikipedia_2019-04-02T06:18:07.137Z] information, overlord task status[FAILED].


Hi Robert,

The reason for a task failure is supposed to be in the task logs. In the overlord console, do you see a column after "errorMsg" with a log (all) link? Could you check the task logs from there and see what the error is?

Thanks,

Surekha

How do I enable logs in the overlord console? When I select the logs in the console, there aren't any logs saved.

Hi Robert,
In your common.runtime.properties file there is a property:

druid.indexer.logs.directory

I believe this should have the log location.
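For example, a file-based task-log setup in common.runtime.properties might look like this (the directory path is illustrative; it must exist and be writable by the Druid processes):

```properties
# Where completed task logs are written (file-based task log storage).
druid.indexer.logs.type=file
druid.indexer.logs.directory=var/druid/indexing-logs
```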

Hope that helps.

Regards,

I added the indexer.logs properties to the overlord and middleManager runtime.properties, but that did not seem to work. Does it make a difference if I add them to common.runtime.properties instead? And if so, which service needs its common.runtime.properties updated? Here is a copy of the runtime.properties for the overlord and middleManagers:

overlord runtime.properties:

druid.service=druid/overlord
druid.plaintextPort=8090

druid.indexer.queue.startDelay=PT30S

druid.indexer.logs.type=file
druid.indexer.runner.type=local
druid.indexer.logs.directory=/tmp/druid-logs

middlemanager runtime.properties:

druid.service=druid/middleManager
druid.plaintextPort=8091

# Number of tasks per middleManager
druid.worker.capacity=3

# Task launch parameters
druid.indexer.runner.javaOpts=-server -Xmx2g -Duser.timezone=UTC -Dfile.encoding=UTF-8 -XX:+ExitOnOut
druid.indexer.task.baseTaskDir=var/druid/task

# HTTP server threads
druid.server.http.numThreads=25

# Processing threads and buffers on Peons
druid.indexer.fork.property.druid.processing.buffer.sizeBytes=536870912
druid.indexer.fork.property.druid.processing.numThreads=2

# Hadoop indexing
druid.indexer.task.hadoopWorkingPath=var/druid/hadoop-tmp

druid.selectors.indexing.serviceName=druid:overlord
druid.indexer.logs.type=file
druid.indexer.logs.directory=/tmp/druid-logs
druid.indexer.runner.type=local

Hi Robert,
It should be in the common.runtime.properties file, which should be present on every node that runs Druid services.

Regards,

Robert

After finally getting the task logs working correctly, I was able to run the task successfully, but then ran into a new set of issues :frowning:

On the coordinator console the datasource never moves out of the red status. Here is my ingestion spec, followed by the logs from the historical node.

{
  "type" : "index",
  "spec" : {
    "dataSchema" : {
      "dataSource" : "wikipedia",
      "parser" : {
        "type" : "string",
        "parseSpec" : {
          "format" : "json",
          "dimensionsSpec" : {
            "dimensions" : [
              "channel",
              "cityName",
              "comment",
              "countryIsoCode",
              "countryName",
              "isAnonymous",
              "isMinor",
              "isNew",
              "isRobot",
              "isUnpatrolled",
              "metroCode",
              "namespace",
              "page",
              "regionIsoCode",
              "regionName",
              "user",
              { "name": "added", "type": "long" },
              { "name": "deleted", "type": "long" },
              { "name": "delta", "type": "long" }
            ]
          },
          "timestampSpec": {
            "column": "time",
            "format": "iso"
          }
        }
      },
      "metricsSpec" : [],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "day",
        "queryGranularity" : "none",
        "intervals" : ["2015-09-12/2015-09-13"],
        "rollup" : false
      }
    },
    "ioConfig" : {
      "type" : "index",
      "firehose" : {
        "type" : "local",
        "baseDir" : "quickstart/tutorial/",
        "filter" : "wikiticker-2015-09-12-sampled.json.gz"
      },
      "appendToExisting" : false
    },
    "tuningConfig" : {
      "type" : "index",
      "maxRowsPerSegment" : 5000000,
      "maxRowsInMemory" : 25000
    }
  }
}
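One way to catch malformed spec JSON before POSTing it to the overlord is a quick local validation pass; a small sketch assuming python3 is on the path:

```shell
#!/bin/sh
# Validate a JSON spec file; prints OK or INVALID without submitting anything.
validate_spec() {
  if python3 -m json.tool "$1" >/dev/null 2>&1; then
    echo "OK: $1"
  else
    echo "INVALID: $1"
  fi
}

# Usage against the tutorial spec:
#   validate_spec quickstart/tutorial/wikipedia-index.json
```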

And here are the logs from the historical service:

2019-04-08T05:48:45,766 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.ServletContextHandler@17216605{/,null,AVAILABLE}
2019-04-08T05:48:45,776 INFO [main] org.eclipse.jetty.server.AbstractConnector - Started ServerConnector@4f5c30b1{HTTP/1.1,[http/1.1]}{0.0.0.0:8083}
2019-04-08T05:48:45,777 INFO [main] org.eclipse.jetty.server.Server - Started @9584ms
2019-04-08T05:48:45,777 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle - Starting lifecycle [module] stage [ANNOUNCEMENTS]
2019-04-08T05:48:45,778 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void org.apache.druid.curator.announcement.Announcer.start()] on object[org.apache.druid.curator.announcement.Announcer@62e73ab6].
2019-04-08T05:48:45,779 INFO [main] org.apache.druid.curator.announcement.Announcer - Starting announcer
2019-04-08T05:48:45,850 INFO [main] org.apache.druid.curator.discovery.CuratorDruidNodeAnnouncer - Announcing [DiscoveryDruidNode{druidNode=DruidNode{serviceName=‘druid/historical’, host=‘172.19.0.7’, bindOnHost=false, port=-1, plaintextPort=8083, enablePlaintextPort=true, tlsPort=-1, enableTlsPort=false}, nodeType=‘HISTORICAL’, services={dataNodeService=DataNodeService{tier=’_default_tier’, maxSize=130000000000, type=historical, priority=0}, lookupNodeService=LookupNodeService{lookupTier=’__default’}}}].
2019-04-08T05:48:45,875 INFO [main] org.apache.druid.curator.discovery.CuratorDruidNodeAnnouncer - Announced [DiscoveryDruidNode{druidNode=DruidNode{serviceName=‘druid/historical’, host=‘172.19.0.7’, bindOnHost=false, port=-1, plaintextPort=8083, enablePlaintextPort=true, tlsPort=-1, enableTlsPort=false}, nodeType=‘HISTORICAL’, services={dataNodeService=DataNodeService{tier=’_default_tier’, maxSize=130000000000, type=historical, priority=0}, lookupNodeService=LookupNodeService{lookupTier=’__default’}}}].
2019-04-08T05:48:45,876 INFO [main] org.apache.druid.java.util.common.lifecycle.Lifecycle - Successfully started lifecycle [module]
2019-04-08T05:49:40,714 INFO [ZkCoordinator] org.apache.druid.server.coordination.ZkCoordinator - New request[LOAD: wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z] with zNode[/druid/loadQueue/172.19.0.7:8083/wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z].
2019-04-08T05:49:40,730 INFO [ZkCoordinator] org.apache.druid.server.coordination.SegmentLoadDropHandler - Loading segment wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z
2019-04-08T05:49:40,760 WARN [ZkCoordinator] org.apache.druid.server.coordination.BatchDataSegmentAnnouncer - No path to unannounce segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z]
2019-04-08T05:49:40,762 INFO [ZkCoordinator] org.apache.druid.server.SegmentManager - Told to delete a queryable for a dataSource[wikipedia] that doesn’t exist.
2019-04-08T05:49:40,763 INFO [ZkCoordinator] org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager - Deleting directory[var/druid/segment-cache/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2019-04-08T05:49:09.110Z/0]
2019-04-08T05:49:40,769 INFO [ZkCoordinator] org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager - Deleting directory[var/druid/segment-cache/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2019-04-08T05:49:09.110Z]
2019-04-08T05:49:40,769 INFO [ZkCoordinator] org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager - Deleting directory[var/druid/segment-cache/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z]
2019-04-08T05:49:40,770 INFO [ZkCoordinator] org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager - Deleting directory[var/druid/segment-cache/wikipedia]
2019-04-08T05:49:40,770 WARN [ZkCoordinator] org.apache.druid.server.coordination.SegmentLoadDropHandler - Unable to delete segmentInfoCacheFile[var/druid/segment-cache/info_dir/wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z]
2019-04-08T05:49:40,795 ERROR [ZkCoordinator] org.apache.druid.server.coordination.SegmentLoadDropHandler - Failed to load segment for dataSource: {class=org.apache.druid.server.coordination.SegmentLoadDropHandler, exceptionType=class org.apache.druid.segment.loading.SegmentLoadingException, exceptionMessage=Exception loading segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z], segment=DataSegment{size=4821529, shardSpec=NumberedShardSpec{partitionNum=0, partitions=0}, metrics=, dimensions=[channel, cityName, comment, countryIsoCode, countryName, isAnonymous, isMinor, isNew, isRobot, isUnpatrolled, metroCode, namespace, page, regionIsoCode, regionName, user, added, deleted, delta], version=‘2019-04-08T05:49:09.110Z’, loadSpec={type=>local, path=>/opt/apache-druid-0.15.0-incubating-SNAPSHOT/var/druid/segments/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2019-04-08T05:49:09.110Z/0/index.zip}, interval=2015-09-12T00:00:00.000Z/2015-09-13T00:00:00.000Z, dataSource=‘wikipedia’, binaryVersion=‘9’}}
org.apache.druid.segment.loading.SegmentLoadingException: Exception loading segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z]
at org.apache.druid.server.coordination.SegmentLoadDropHandler.loadSegment(SegmentLoadDropHandler.java:268) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.coordination.SegmentLoadDropHandler.addSegment(SegmentLoadDropHandler.java:312) [druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.coordination.SegmentChangeRequestLoad.go(SegmentChangeRequestLoad.java:47) [druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.coordination.ZkCoordinator$1.childEvent(ZkCoordinator.java:118) [druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.curator.framework.recipes.cache.PathChildrenCache$5.apply(PathChildrenCache.java:538) [curator-recipes-4.1.0.jar:4.1.0]
at org.apache.curator.framework.recipes.cache.PathChildrenCache$5.apply(PathChildrenCache.java:532) [curator-recipes-4.1.0.jar:4.1.0]
at org.apache.curator.framework.listen.ListenerContainer$1.run(ListenerContainer.java:93) [curator-framework-4.1.0.jar:4.1.0]
at org.apache.curator.shaded.com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435) [curator-client-4.1.0.jar:?]
at org.apache.curator.framework.listen.ListenerContainer.forEach(ListenerContainer.java:85) [curator-framework-4.1.0.jar:4.1.0]
at org.apache.curator.framework.recipes.cache.PathChildrenCache.callListeners(PathChildrenCache.java:530) [curator-recipes-4.1.0.jar:4.1.0]
at org.apache.curator.framework.recipes.cache.EventOperation.invoke(EventOperation.java:35) [curator-recipes-4.1.0.jar:4.1.0]
at org.apache.curator.framework.recipes.cache.PathChildrenCache$9.run(PathChildrenCache.java:808) [curator-recipes-4.1.0.jar:4.1.0]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_212]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_212]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_212]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_212]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_212]
Caused by: java.lang.IllegalArgumentException: Instantiation of [simple type, class org.apache.druid.segment.loading.LocalLoadSpec] value failed: [/opt/apache-druid-0.15.0-incubating-SNAPSHOT/var/druid/segments/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2019-04-08T05:49:09.110Z/0/index.zip] does not exist
at com.fasterxml.jackson.databind.ObjectMapper._convert(ObjectMapper.java:3459) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.ObjectMapper.convertValue(ObjectMapper.java:3378) ~[jackson-databind-2.6.7.jar:2.6.7]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocation(SegmentLoaderLocalCacheManager.java:235) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocationWithStartMarker(SegmentLoaderLocalCacheManager.java:224) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadSegmentWithRetry(SegmentLoaderLocalCacheManager.java:187) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegmentFiles(SegmentLoaderLocalCacheManager.java:164) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegment(SegmentLoaderLocalCacheManager.java:131) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.SegmentManager.getAdapter(SegmentManager.java:196) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.SegmentManager.loadSegment(SegmentManager.java:157) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.coordination.SegmentLoadDropHandler.loadSegment(SegmentLoadDropHandler.java:264) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
… 18 more
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Instantiation of [simple type, class org.apache.druid.segment.loading.LocalLoadSpec] value failed: [/opt/apache-druid-0.15.0-incubating-SNAPSHOT/var/druid/segments/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2019-04-08T05:49:09.110Z/0/index.zip] does not exist
at com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.wrapException(StdValueInstantiator.java:399) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.createFromObjectWith(StdValueInstantiator.java:231) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:135) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:442) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1099) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:296) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeOther(BeanDeserializer.java:166) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:136) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedForId(AsPropertyTypeDeserializer.java:122) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromObject(AsPropertyTypeDeserializer.java:93) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.AbstractDeserializer.deserializeWithType(AbstractDeserializer.java:131) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.impl.TypeWrappedDeserializer.deserialize(TypeWrappedDeserializer.java:42) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.ObjectMapper._convert(ObjectMapper.java:3454) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.ObjectMapper.convertValue(ObjectMapper.java:3378) ~[jackson-databind-2.6.7.jar:2.6.7]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocation(SegmentLoaderLocalCacheManager.java:235) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocationWithStartMarker(SegmentLoaderLocalCacheManager.java:224) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadSegmentWithRetry(SegmentLoaderLocalCacheManager.java:187) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegmentFiles(SegmentLoaderLocalCacheManager.java:164) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegment(SegmentLoaderLocalCacheManager.java:131) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.SegmentManager.getAdapter(SegmentManager.java:196) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.SegmentManager.loadSegment(SegmentManager.java:157) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.coordination.SegmentLoadDropHandler.loadSegment(SegmentLoadDropHandler.java:264) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
… 18 more
Caused by: java.lang.IllegalArgumentException: [/opt/apache-druid-0.15.0-incubating-SNAPSHOT/var/druid/segments/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2019-04-08T05:49:09.110Z/0/index.zip] does not exist
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:148) ~[guava-16.0.1.jar:?]
at org.apache.druid.segment.loading.LocalLoadSpec.&lt;init&gt;(LocalLoadSpec.java:51) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_212]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_212]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_212]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_212]
at com.fasterxml.jackson.databind.introspect.AnnotatedConstructor.call(AnnotatedConstructor.java:125) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.createFromObjectWith(StdValueInstantiator.java:227) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:135) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:442) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1099) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:296) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeOther(BeanDeserializer.java:166) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:136) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedForId(AsPropertyTypeDeserializer.java:122) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromObject(AsPropertyTypeDeserializer.java:93) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.AbstractDeserializer.deserializeWithType(AbstractDeserializer.java:131) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.impl.TypeWrappedDeserializer.deserialize(TypeWrappedDeserializer.java:42) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.ObjectMapper._convert(ObjectMapper.java:3454) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.ObjectMapper.convertValue(ObjectMapper.java:3378) ~[jackson-databind-2.6.7.jar:2.6.7]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocation(SegmentLoaderLocalCacheManager.java:235) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocationWithStartMarker(SegmentLoaderLocalCacheManager.java:224) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadSegmentWithRetry(SegmentLoaderLocalCacheManager.java:187) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegmentFiles(SegmentLoaderLocalCacheManager.java:164) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegment(SegmentLoaderLocalCacheManager.java:131) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.SegmentManager.getAdapter(SegmentManager.java:196) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.SegmentManager.loadSegment(SegmentManager.java:157) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.coordination.SegmentLoadDropHandler.loadSegment(SegmentLoadDropHandler.java:264) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
... 18 more
2019-04-08T05:49:40,842 INFO [ZkCoordinator] org.apache.druid.server.coordination.ZkCoordinator - Completed request [LOAD: wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z]
2019-04-08T05:49:40,842 INFO [ZkCoordinator] org.apache.druid.server.coordination.ZkCoordinator - zNode[/druid/loadQueue/172.19.0.7:8083/wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z] was removed
2019-04-08T05:50:10,500 INFO [ZkCoordinator] org.apache.druid.server.coordination.ZkCoordinator - New request[LOAD: wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z] with zNode[/druid/loadQueue/172.19.0.7:8083/wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z].
2019-04-08T05:50:10,500 INFO [ZkCoordinator] org.apache.druid.server.coordination.SegmentLoadDropHandler - Loading segment wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z
2019-04-08T05:50:10,503 WARN [ZkCoordinator] org.apache.druid.server.coordination.BatchDataSegmentAnnouncer - No path to unannounce segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z]
2019-04-08T05:50:10,503 INFO [ZkCoordinator] org.apache.druid.server.SegmentManager - Told to delete a queryable for a dataSource[wikipedia] that doesn't exist.
2019-04-08T05:50:10,503 INFO [ZkCoordinator] org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager - Deleting directory[var/druid/segment-cache/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2019-04-08T05:49:09.110Z/0]
2019-04-08T05:50:10,503 INFO [ZkCoordinator] org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager - Deleting directory[var/druid/segment-cache/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2019-04-08T05:49:09.110Z]
2019-04-08T05:50:10,504 INFO [ZkCoordinator] org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager - Deleting directory[var/druid/segment-cache/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z]
2019-04-08T05:50:10,504 INFO [ZkCoordinator] org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager - Deleting directory[var/druid/segment-cache/wikipedia]
2019-04-08T05:50:10,504 WARN [ZkCoordinator] org.apache.druid.server.coordination.SegmentLoadDropHandler - Unable to delete segmentInfoCacheFile[var/druid/segment-cache/info_dir/wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z]
2019-04-08T05:50:10,506 ERROR [ZkCoordinator] org.apache.druid.server.coordination.SegmentLoadDropHandler - Failed to load segment for dataSource: {class=org.apache.druid.server.coordination.SegmentLoadDropHandler, exceptionType=class org.apache.druid.segment.loading.SegmentLoadingException, exceptionMessage=Exception loading segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z], segment=DataSegment{size=4821529, shardSpec=NumberedShardSpec{partitionNum=0, partitions=0}, metrics=, dimensions=[channel, cityName, comment, countryIsoCode, countryName, isAnonymous, isMinor, isNew, isRobot, isUnpatrolled, metroCode, namespace, page, regionIsoCode, regionName, user, added, deleted, delta], version='2019-04-08T05:49:09.110Z', loadSpec={type=>local, path=>/opt/apache-druid-0.15.0-incubating-SNAPSHOT/var/druid/segments/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2019-04-08T05:49:09.110Z/0/index.zip}, interval=2015-09-12T00:00:00.000Z/2015-09-13T00:00:00.000Z, dataSource='wikipedia', binaryVersion='9'}}
org.apache.druid.segment.loading.SegmentLoadingException: Exception loading segment[wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z]
at org.apache.druid.server.coordination.SegmentLoadDropHandler.loadSegment(SegmentLoadDropHandler.java:268) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.coordination.SegmentLoadDropHandler.addSegment(SegmentLoadDropHandler.java:312) [druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.coordination.SegmentChangeRequestLoad.go(SegmentChangeRequestLoad.java:47) [druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.coordination.ZkCoordinator$1.childEvent(ZkCoordinator.java:118) [druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.curator.framework.recipes.cache.PathChildrenCache$5.apply(PathChildrenCache.java:538) [curator-recipes-4.1.0.jar:4.1.0]
at org.apache.curator.framework.recipes.cache.PathChildrenCache$5.apply(PathChildrenCache.java:532) [curator-recipes-4.1.0.jar:4.1.0]
at org.apache.curator.framework.listen.ListenerContainer$1.run(ListenerContainer.java:93) [curator-framework-4.1.0.jar:4.1.0]
at org.apache.curator.shaded.com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:435) [curator-client-4.1.0.jar:?]
at org.apache.curator.framework.listen.ListenerContainer.forEach(ListenerContainer.java:85) [curator-framework-4.1.0.jar:4.1.0]
at org.apache.curator.framework.recipes.cache.PathChildrenCache.callListeners(PathChildrenCache.java:530) [curator-recipes-4.1.0.jar:4.1.0]
at org.apache.curator.framework.recipes.cache.EventOperation.invoke(EventOperation.java:35) [curator-recipes-4.1.0.jar:4.1.0]
at org.apache.curator.framework.recipes.cache.PathChildrenCache$9.run(PathChildrenCache.java:808) [curator-recipes-4.1.0.jar:4.1.0]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_212]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_212]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_212]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_212]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_212]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_212]
Caused by: java.lang.IllegalArgumentException: Instantiation of [simple type, class org.apache.druid.segment.loading.LocalLoadSpec] value failed: [/opt/apache-druid-0.15.0-incubating-SNAPSHOT/var/druid/segments/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2019-04-08T05:49:09.110Z/0/index.zip] does not exist
at com.fasterxml.jackson.databind.ObjectMapper._convert(ObjectMapper.java:3459) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.ObjectMapper.convertValue(ObjectMapper.java:3378) ~[jackson-databind-2.6.7.jar:2.6.7]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocation(SegmentLoaderLocalCacheManager.java:235) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocationWithStartMarker(SegmentLoaderLocalCacheManager.java:224) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadSegmentWithRetry(SegmentLoaderLocalCacheManager.java:187) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegmentFiles(SegmentLoaderLocalCacheManager.java:164) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegment(SegmentLoaderLocalCacheManager.java:131) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.SegmentManager.getAdapter(SegmentManager.java:196) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.SegmentManager.loadSegment(SegmentManager.java:157) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.coordination.SegmentLoadDropHandler.loadSegment(SegmentLoadDropHandler.java:264) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
... 18 more
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Instantiation of [simple type, class org.apache.druid.segment.loading.LocalLoadSpec] value failed: [/opt/apache-druid-0.15.0-incubating-SNAPSHOT/var/druid/segments/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2019-04-08T05:49:09.110Z/0/index.zip] does not exist
at com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.wrapException(StdValueInstantiator.java:399) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.createFromObjectWith(StdValueInstantiator.java:231) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:135) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:442) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1099) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:296) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeOther(BeanDeserializer.java:166) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:136) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedForId(AsPropertyTypeDeserializer.java:122) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromObject(AsPropertyTypeDeserializer.java:93) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.AbstractDeserializer.deserializeWithType(AbstractDeserializer.java:131) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.impl.TypeWrappedDeserializer.deserialize(TypeWrappedDeserializer.java:42) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.ObjectMapper._convert(ObjectMapper.java:3454) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.ObjectMapper.convertValue(ObjectMapper.java:3378) ~[jackson-databind-2.6.7.jar:2.6.7]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocation(SegmentLoaderLocalCacheManager.java:235) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocationWithStartMarker(SegmentLoaderLocalCacheManager.java:224) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadSegmentWithRetry(SegmentLoaderLocalCacheManager.java:187) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegmentFiles(SegmentLoaderLocalCacheManager.java:164) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegment(SegmentLoaderLocalCacheManager.java:131) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.SegmentManager.getAdapter(SegmentManager.java:196) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.SegmentManager.loadSegment(SegmentManager.java:157) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.coordination.SegmentLoadDropHandler.loadSegment(SegmentLoadDropHandler.java:264) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
... 18 more
Caused by: java.lang.IllegalArgumentException: [/opt/apache-druid-0.15.0-incubating-SNAPSHOT/var/druid/segments/wikipedia/2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z/2019-04-08T05:49:09.110Z/0/index.zip] does not exist
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:148) ~[guava-16.0.1.jar:?]
at org.apache.druid.segment.loading.LocalLoadSpec.&lt;init&gt;(LocalLoadSpec.java:51) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_212]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_212]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_212]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_212]
at com.fasterxml.jackson.databind.introspect.AnnotatedConstructor.call(AnnotatedConstructor.java:125) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.createFromObjectWith(StdValueInstantiator.java:227) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:135) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:442) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1099) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:296) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeOther(BeanDeserializer.java:166) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:136) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedForId(AsPropertyTypeDeserializer.java:122) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromObject(AsPropertyTypeDeserializer.java:93) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.AbstractDeserializer.deserializeWithType(AbstractDeserializer.java:131) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.deser.impl.TypeWrappedDeserializer.deserialize(TypeWrappedDeserializer.java:42) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.ObjectMapper._convert(ObjectMapper.java:3454) ~[jackson-databind-2.6.7.jar:2.6.7]
at com.fasterxml.jackson.databind.ObjectMapper.convertValue(ObjectMapper.java:3378) ~[jackson-databind-2.6.7.jar:2.6.7]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocation(SegmentLoaderLocalCacheManager.java:235) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadInLocationWithStartMarker(SegmentLoaderLocalCacheManager.java:224) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.loadSegmentWithRetry(SegmentLoaderLocalCacheManager.java:187) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegmentFiles(SegmentLoaderLocalCacheManager.java:164) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegment(SegmentLoaderLocalCacheManager.java:131) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.SegmentManager.getAdapter(SegmentManager.java:196) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.SegmentManager.loadSegment(SegmentManager.java:157) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
at org.apache.druid.server.coordination.SegmentLoadDropHandler.loadSegment(SegmentLoadDropHandler.java:264) ~[druid-server-0.15.0-incubating-SNAPSHOT.jar:0.15.0-incubating-SNAPSHOT]
... 18 more
2019-04-08T05:50:10,518 INFO [ZkCoordinator] org.apache.druid.server.coordination.ZkCoordinator - Completed request [LOAD: wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z]
2019-04-08T05:50:10,518 INFO [ZkCoordinator] org.apache.druid.server.coordination.ZkCoordinator - zNode[/druid/loadQueue/172.19.0.7:8083/wikipedia_2015-09-12T00:00:00.000Z_2015-09-13T00:00:00.000Z_2019-04-08T05:49:09.110Z] was removed
(The same LOAD request, warnings, and identical stack trace then repeat every 30 seconds as the coordinator keeps retrying the assignment.)
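For what it's worth, the root cause visible in the trace is the `loadSpec={type=>local, path=>/opt/apache-druid-0.15.0-incubating-SNAPSHOT/var/druid/segments/.../index.zip} ... does not exist` error: the middleManager pushed the segment to *local* deep storage inside its own container, but the historical container, which has a separate filesystem, is told to load it from the same local path. With `druid.storage.type=local` under docker-compose, all services need to share that `var` directory. Below is a sketch of what I believe would fix it; the service names and mount path are assumptions taken from the error message and the repo's compose file, so adjust them to match yours:

```yaml
# Hypothetical fragment for distribution/docker/docker-compose.yml:
# mount one named volume as the shared var/ directory in every service
# (shown here only for the two involved in the failure), so a segment
# pushed by the middleManager is visible to the historical.
services:
  middlemanager:
    volumes:
      - druid_shared:/opt/apache-druid-0.15.0-incubating-SNAPSHOT/var
  historical:
    volumes:
      - druid_shared:/opt/apache-druid-0.15.0-incubating-SNAPSHOT/var

volumes:
  druid_shared: {}
```

You can confirm whether this is the problem by checking if the file from the error exists in each container, e.g. `docker exec <historical-container> ls /opt/apache-druid-0.15.0-incubating-SNAPSHOT/var/druid/segments/wikipedia` versus the same command against the middleManager container.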