Segments not loaded to S3

Hi Guys,

I am running the realtime and historical nodes on separate instances, with S3 configured as deep storage. I am observing that data is not handed off to S3 after the window period elapses. Below is the error from the historical node.

Also, I have configured druid.storage.storageDirectory as /mnt/data/druid/localStorage in the historical node's runtime.properties, but the metadata in the DB still refers to the /tmp folder.

I am also observing that no data is being created under the indexCache/info_dir path on the historical node.

Please guide me on solving this. Thanks in advance.

The historical node configuration is attached below.

Any thoughts? Could you please help? Am I missing any configuration?

One other thing to add: on the realtime node I can see some index files written under /tmp/druid/localStorage, and the same path is recorded in the DB, even though my config file points to /mnt/data/druid.

It appears you have not correctly configured things in your indexing. Are you making sure to include the common runtime.properties in the classpath of every node? Did you update the deep storage configs in the common runtime.properties?
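For reference, the druid.storage.* properties are read by whichever node hands the segment off to deep storage (the realtime node in your setup), which is why they belong in the common runtime.properties rather than only in the historical node's file; /tmp/druid/localStorage is, if I remember right, the default local storageDirectory, which would explain the files you saw under /tmp. A minimal S3 deep storage block in common.runtime.properties looks roughly like this (bucket, base key, and credentials are placeholders):

druid.storage.type=s3
druid.storage.bucket=your-bucket
druid.storage.baseKey=druid/segments
druid.s3.accessKey=YOUR_ACCESS_KEY
druid.s3.secretKey=YOUR_SECRET_KEY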

Thanks Fangjin. The local path (druid.storage.storageDirectory) was not specified in common.runtime.properties; I have added it now, and segments are written to the configured folder. But I am facing the issues below. Please help.

1. I am currently running all nodes on a single instance, but segments stay on the local server only; data is not moved to S3. I am attaching the configuration files and the coordinator and historical logs.
The window period configured is PT20M.

2. When I bring up another fresh historical node on its own, I get the error below, since it looks up the local path for segment files and fails.

2015-10-12T14:02:12,371 ERROR [ZkCoordinator-0] io.druid.server.coordination.ZkCoordinator - Failed to load segment for dataSource: {class=io.druid.server.coordination.ZkCoordinator, exceptionType=class io.druid.segment.loading.SegmentLoadingException, exceptionMessage=Exception loading segment[WiFiMXDruid_2015-10-12T09:14:00.000Z_2015-10-12T09:15:00.000Z_2015-10-12T09:14:00.000Z], segment=DataSegment{size=650792, shardSpec=NoneShardSpec, metrics=, dimensions=[actionName, actionType, age, apMacAddress, customer, day, deviceModel, deviceOS, experienceZone, gender, latitude, longitude, macAddress, pageName, ssid, subscriberId, user, userId], version='2015-10-12T09:14:00.000Z', loadSpec={type=local, path=/mnt/data/druid/localStorage/WiFiMXDruid/2015-10-12T09:14:00.000Z_2015-10-12T09:15:00.000Z/2015-10-12T09:14:00.000Z/0/index.zip}, interval=2015-10-12T09:14:00.000Z/2015-10-12T09:15:00.000Z, dataSource='WiFiMXDruid', binaryVersion='9'}}
io.druid.segment.loading.SegmentLoadingException: Exception loading segment[WiFiMXDruid_2015-10-12T09:14:00.000Z_2015-10-12T09:15:00.000Z_2015-10-12T09:14:00.000Z]
at io.druid.server.coordination.ZkCoordinator.loadSegment(ZkCoordinator.java:146) ~[druid-server-0.8.0.jar:0.8.0]
at io.druid.server.coordination.ZkCoordinator.addSegment(ZkCoordinator.java:171) [druid-server-0.8.0.jar:0.8.0]
at io.druid.server.coordination.SegmentChangeRequestLoad.go(SegmentChangeRequestLoad.java:42) [druid-server-0.8.0.jar:0.8.0]
at io.druid.server.coordination.BaseZkCoordinator$1.childEvent(BaseZkCoordinator.java:121) [druid-server-0.8.0.jar:0.8.0]
at org.apache.curator.framework.recipes.cache.PathChildrenCache$5.apply(PathChildrenCache.java:516) [curator-recipes-2.7.0.jar:?]
at org.apache.curator.framework.recipes.cache.PathChildrenCache$5.apply(PathChildrenCache.java:510) [curator-recipes-2.7.0.jar:?]
at org.apache.curator.framework.listen.ListenerContainer$1.run(ListenerContainer.java:92) [curator-framework-2.7.0.jar:?]
at com.google.common.util.concurrent.MoreExecutors$SameThreadExecutorService.execute(MoreExecutors.java:297) [guava-16.0.1.jar:?]
at org.apache.curator.framework.listen.ListenerContainer.forEach(ListenerContainer.java:83) [curator-framework-2.7.0.jar:?]
at org.apache.curator.framework.recipes.cache.PathChildrenCache.callListeners(PathChildrenCache.java:507) [curator-recipes-2.7.0.jar:?]
at org.apache.curator.framework.recipes.cache.EventOperation.invoke(EventOperation.java:35) [curator-recipes-2.7.0.jar:?]
at org.apache.curator.framework.recipes.cache.PathChildrenCache$9.run(PathChildrenCache.java:759) [curator-recipes-2.7.0.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [?:1.7.0_85]
at java.util.concurrent.FutureTask.run(FutureTask.java:262) [?:1.7.0_85]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [?:1.7.0_85]
at java.util.concurrent.FutureTask.run(FutureTask.java:262) [?:1.7.0_85]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [?:1.7.0_85]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [?:1.7.0_85]
at java.lang.Thread.run(Thread.java:745) [?:1.7.0_85]
Caused by: java.lang.IllegalArgumentException: Instantiation of [simple type, class io.druid.segment.loading.LocalLoadSpec] value failed: [/mnt/data/druid/localStorage/WiFiMXDruid/2015-10-12T09:14:00.000Z_2015-10-12T09:15:00.000Z/2015-10-12T09:14:00.000Z/0/index.zip] does not exist
at com.fasterxml.jackson.databind.ObjectMapper._convert(ObjectMapper.java:2774) ~[jackson-databind-2.4.4.jar:2.4.4]
at com.fasterxml.jackson.databind.ObjectMapper.convertValue(ObjectMapper.java:2700) ~[jackson-databind-2.4.4.jar:2.4.4]
at io.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegmentFiles(SegmentLoaderLocalCacheManager.java:140) ~[druid-server-0.8.0.jar:0.8.0]
at io.druid.segment.loading.SegmentLoaderLocalCacheManager.getSegment(SegmentLoaderLocalCacheManager.java:93) ~[druid-server-0.8.0.jar:0.8.0]
at io.druid.server.coordination.ServerManager.loadSegment(ServerManager.java:151) ~[druid-server-0.8.0.jar:0.8.0]
at io.druid.server.coordination.ZkCoordinator.loadSegment(ZkCoordinator.java:142) ~[druid-server-0.8.0.jar:0.8.0]
... 18 more

historical.log (5.49 KB)

coordinator.log (1.65 KB)

Configuration-files.txt (1.06 KB)

The realtime node logs are below.

2015-10-12T10:38:00,167 INFO [WiFiMXDruid-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [6,317] sinks to persist and merge
2015-10-12T10:39:00,172 INFO [WiFiMXDruid-overseer-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [6,317] sinks to persist and merge

2015-10-13T03:42:02,996 INFO [chief-WiFiMXDruid[0]] io.druid.server.coordination.BatchDataSegmentAnnouncer - Announcing segment[WiFiMXDruid_2015-10-13T03:42:00.000Z_2015-10-13T03:43:00.000Z_2015-10-13T03:42:00.000Z] at path[/druid/segments/l-emspnewanalytics100:8080/l-emspnewanalytics100:8080_realtime__default_tier_2015-10-12T09:51:17.354Z_7a5cc46af32e43e9bf46e77818b87bca76]
2015-10-13T03:42:42,127 INFO [plumber_merge_0] io.druid.segment.realtime.plumber.RealtimePlumber - Segment[DataSegment{size=657926, shardSpec=NoneShardSpec, metrics=, dimensions=[actionName, actionType, age, apMacAddress, customer, day, deviceOS, experienceZone, gender, latitude, longitude, macAddress, ssid, subscriberId, user, userId], version='2015-10-13T03:20:00.000Z', loadSpec={type=local, path=/mnt/data/druid/localStorage/WiFiMXDruid/2015-10-13T03:20:00.000Z_2015-10-13T03:21:00.000Z/2015-10-13T03:20:00.000Z/0/index.zip}, interval=2015-10-13T03:20:00.000Z/2015-10-13T03:21:00.000Z, dataSource='WiFiMXDruid', binaryVersion='9'}] matches sink[Sink{interval=2015-10-13T03:20:00.000Z/2015-10-13T03:21:00.000Z, schema=io.druid.segment.indexing.DataSchema@64b3dbe}] on server[DruidServerMetadata{name='l-emspnewanalytics100:8083', host='l-emspnewanalytics100:8083', maxSize=10000000000, tier='_default_tier', type='historical', priority='0'}]
2015-10-13T03:42:42,128 INFO [plumber_merge_0] io.druid.segment.realtime.plumber.RealtimePlumber - Segment version[2015-10-13T03:20:00.000Z] >= sink version[2015-10-13T03:20:00.000Z]
2015-10-13T03:42:42,129 INFO [plumber_merge_0] io.druid.server.coordination.BatchDataSegmentAnnouncer - Unannouncing segment[WiFiMXDruid_2015-10-13T03:20:00.000Z_2015-10-13T03:21:00.000Z_2015-10-13T03:20:00.000Z] at path[/druid/segments/l-emspnewanalytics100:8080/l-emspnewanalytics100:8080_realtime__default_tier_2015-10-12T09:51:17.354Z_7a5cc46af32e43e9bf46e77818b87bca76]
2015-10-13T03:42:42,130 INFO [plumber_merge_0] io.druid.segment.realtime.plumber.RealtimePlumber - Deleting Index File[/mnt/data/druid/realtime/basePersist/WiFiMXDruid/2015-10-13T03:20:00.000Z_2015-10-13T03:21:00.000Z]
2015-10-13T03:42:42,134 INFO [plumber_merge_0] io.druid.segment.realtime.plumber.RealtimePlumber - Removing sinkKey 1444706400000 for segment WiFiMXDruid_2015-10-13T03:20:00.000Z_2015-10-13T03:21:00.000Z_2015-10-13T03:20:00.000Z
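Note that the loadSpec in the merge log above is still {type=local, ...}, so the realtime node is clearly pushing segments to local deep storage rather than to S3.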

Hi,

I have sorted out this issue: the S3 extension was not configured in common.runtime.properties, so data was being stored on the local disk. After adding this configuration, data is uploaded to S3.
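For anyone hitting the same problem, the missing piece in my common.runtime.properties was loading the S3 extension, roughly as below (the extension version should match your Druid release; mine is 0.8.0):

druid.extensions.coordinates=["io.druid.extensions:druid-s3-extensions:0.8.0"]
druid.storage.type=s3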

-Suresh