Exception while loading segments from S3

Hi Guys,

I am using S3 as deep storage. While loading a segment on a Historical node, I am seeing the exception below.

Config (segment loader):

druid.storage.type=s3_zip
druid.s3.accessKey=xxx
druid.s3.secretKey=xxx
druid.storage.bucket=xxx
druid.storage.baseKey=xxx
druid.storage.disableAcl=true
#druid.storage.archiveBucket=xxx
druid.storage.archiveBucket=xxx
druid.storage.archiveBaseKey=xxx

Is anything wrong in this config?

In the DB, the segment loadSpec type is "s3_zip" in the payload column. Then why is it showing "Unknown loader type[s3_zip]. Known types are [local]"?
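For reference, an S3 loadSpec in the payload column looks roughly like this (the bucket and key here are placeholders, not my actual values):

{
  "type": "s3_zip",
  "bucket": "my-bucket",
  "key": "baseKey/datasource/interval/version/0/index.zip"
}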

io.druid.segment.loading.SegmentLoadingException: Exception loading segment[ec2_test_cluster_2015-03-22T00:00:00.000Z_2015-03-22T00:01:00.000Z_2015-03-24T10:12:04.430Z]
    at io.druid.server.coordination.ZkCoordinator.loadSegment(ZkCoordinator.java:138) ~[druid-services-0.7.0-selfcontained.jar:0.7.0]
    at io.druid.server.coordination.ZkCoordinator.addSegment(ZkCoordinator.java:163) [druid-services-0.7.0-selfcontained.jar:0.7.0]
    at io.druid.server.coordination.SegmentChangeRequestLoad.go(SegmentChangeRequestLoad.java:42) [druid-services-0.7.0-selfcontained.jar:0.7.0]
    at io.druid.server.coordination.BaseZkCoordinator$1.childEvent(BaseZkCoordinator.java:125) [druid-services-0.7.0-selfcontained.jar:0.7.0]
    at org.apache.curator.framework.recipes.cache.PathChildrenCache$5.apply(PathChildrenCache.java:516) [druid-services-0.7.0-selfcontained.jar:0.7.0]
    at org.apache.curator.framework.recipes.cache.PathChildrenCache$5.apply(PathChildrenCache.java:510) [druid-services-0.7.0-selfcontained.jar:0.7.0]
    at org.apache.curator.framework.listen.ListenerContainer$1.run(ListenerContainer.java:92) [druid-services-0.7.0-selfcontained.jar:0.7.0]
    at com.google.common.util.concurrent.MoreExecutors$SameThreadExecutorService.execute(MoreExecutors.java:297) [druid-services-0.7.0-selfcontained.jar:0.7.0]
    at org.apache.curator.framework.listen.ListenerContainer.forEach(ListenerContainer.java:83) [druid-services-0.7.0-selfcontained.jar:0.7.0]
    at org.apache.curator.framework.recipes.cache.PathChildrenCache.callListeners(PathChildrenCache.java:507) [druid-services-0.7.0-selfcontained.jar:0.7.0]
    at org.apache.curator.framework.recipes.cache.EventOperation.invoke(EventOperation.java:35) [druid-services-0.7.0-selfcontained.jar:0.7.0]
    at org.apache.curator.framework.recipes.cache.PathChildrenCache$9.run(PathChildrenCache.java:759) [druid-services-0.7.0-selfcontained.jar:0.7.0]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [?:1.7.0_75]
    at java.util.concurrent.FutureTask.run(FutureTask.java:262) [?:1.7.0_75]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [?:1.7.0_75]
    at java.util.concurrent.FutureTask.run(FutureTask.java:262) [?:1.7.0_75]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [?:1.7.0_75]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [?:1.7.0_75]
    at java.lang.Thread.run(Thread.java:745) [?:1.7.0_75]
Caused by: io.druid.segment.loading.SegmentLoadingException: Unknown loader type[s3_zip]. Known types are [local]
    at io.druid.segment.loading.OmniSegmentLoader.getPuller(OmniSegmentLoader.java:188) ~[druid-services-0.7.0-selfcontained.jar:0.7.0]
    at io.druid.segment.loading.OmniSegmentLoader.getSegmentFiles(OmniSegmentLoader.java:137) ~[druid-services-0.7.0-selfcontained.jar:0.7.0]
    at io.druid.segment.loading.OmniSegmentLoader.getSegment(OmniSegmentLoader.java:93) ~[druid-services-0.7.0-selfcontained.jar:0.7.0]
    at io.druid.server.coordination.ServerManager.loadSegment(ServerManager.java:150) ~[druid-services-0.7.0-selfcontained.jar:0.7.0]
    at io.druid.server.coordination.ZkCoordinator.loadSegment(ZkCoordinator.java:134) ~[druid-services-0.7.0-selfcontained.jar:0.7.0]
    ... 18 more

Thanks.

Jitesh

That looks like a case where the S3 extension is not loaded.

Make sure to include the s3 extension in your common configuration.
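For Druid 0.7.x that means listing the S3 extension in druid.extensions.coordinates in your common runtime properties. A minimal sketch, using the 0.7.0 version shown in your stack trace (append any other extensions you use to the same list):

druid.extensions.coordinates=["io.druid.extensions:druid-s3-extensions:0.7.0"]

Once the extension is loaded, the S3 puller is registered and "s3_zip" will appear among the known loader types instead of just [local].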

Thanks Charles and Fangjin.

I forgot to include the s3 extension in the common configuration.

Now it's running fine.

Thanks.

Jitesh