Overlord startup failed

Hi,

I get the error below when I try to submit my S3 index task:

2019-03-25T10:42:51,433 ERROR [main] org.apache.druid.cli.CliPeon - Error when starting up. Failing.
com.google.inject.ProvisionException: Unable to provision, see the following errors:

1. Unknown provider[s3 ] of Key[type=org.apache.druid.segment.loading.DataSegmentPusher, annotation=[none]], known options[[hdfs, s3, local]]
  at org.apache.druid.guice.PolyBind.createChoice(PolyBind.java:70) (via modules: com.google.inject.util.Modules$OverrideModule -> com.google.inject.util.Modules$OverrideModule -> org.apache.druid.guice.LocalDataStorageDruidModule)
  while locating org.apache.druid.segment.loading.DataSegmentPusher
    for the 4th parameter of org.apache.druid.indexing.common.TaskToolboxFactory.<init>(TaskToolboxFactory.java:113)
  at org.apache.druid.cli.CliPeon$1.configure(CliPeon.java:200) (via modules: com.google.inject.util.Modules$OverrideModule -> com.google.inject.util.Modules$OverrideModule -> org.apache.druid.cli.CliPeon$1)
  while locating org.apache.druid.indexing.common.TaskToolboxFactory
    for the 1st parameter of org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner.<init>(SingleTaskBackgroundRunner.java:95)
  at org.apache.druid.cli.CliPeon$1.configure(CliPeon.java:239) (via modules: com.google.inject.util.Modules$OverrideModule -> com.google.inject.util.Modules$OverrideModule -> org.apache.druid.cli.CliPeon$1)
  while locating org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner
  while locating org.apache.druid.indexing.overlord.TaskRunner
    for the 4th parameter of org.apache.druid.indexing.worker.executor.ExecutorLifecycle.<init>(ExecutorLifecycle.java:79)
  at org.apache.druid.cli.CliPeon$1.configure(CliPeon.java:223) (via modules: com.google.inject.util.Modules$OverrideModule -> com.google.inject.util.Modules$OverrideModule -> org.apache.druid.cli.CliPeon$1)
  while locating org.apache.druid.indexing.worker.executor.ExecutorLifecycle

1 error
  at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:1028) ~[guice-4.1.0.jar:?]
  at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1050) ~[guice-4.1.0.jar:?]
  at org.apache.druid.guice.LifecycleModule$2.start(LifecycleModule.java:132) ~[druid-api-0.13.0-incubating.jar:0.13.0-incubating]
  at org.apache.druid.cli.GuiceRunnable.initLifecycle(GuiceRunnable.java:107) [druid-services-0.13.0-incubating.jar:0.13.0-incubating]
  at org.apache.druid.cli.CliPeon.run(CliPeon.java:348) [druid-services-0.13.0-incubating.jar:0.13.0-incubating]
  at org.apache.druid.cli.Main.main(Main.java:118) [druid-services-0.13.0-incubating.jar:0.13.0-incubating]

Please help.

Thanks,

Anoosha

Can you paste your loadList and S3 properties from common.properties here?

Rommel Garcia

Director, Field Engineering

Hi Rommel,

Sure, here it is:

druid.extensions.loadList=["druid-hdfs-storage", "druid-kafka-indexing-service", "druid-s3-extensions"]

For S3:

druid.storage.type=s3

druid.storage.bucket=testanoosha

druid.storage.baseKey=druid/segments

druid.s3.accessKey=myaccesskey

druid.s3.secretKey=mysecretkey

druid.storage.storageDirectory=s3://testanoosha/druid/segments

Thanks
Anoosha

Hi all,

This issue is resolved.

As the error showed, "Unknown provider[s3 ] of Key[type=org.apache.druid.segment.loading.DataSegmentPusher, annotation=[none]], known options[[hdfs, s3, local]]", there were extra spaces in the S3 property values I had set in the common.properties file.
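For anyone who hits the same error: the provider name in the message appears as "s3 " with a trailing space, so the fix is simply to remove any whitespace after the property values. With the same values from the earlier post (bucket name and keys are just the placeholders I used), the cleaned-up section would look roughly like this:

druid.storage.type=s3
druid.storage.bucket=testanoosha
druid.storage.baseKey=druid/segments
druid.s3.accessKey=myaccesskey
druid.s3.secretKey=mysecretkey
druid.storage.storageDirectory=s3://testanoosha/druid/segments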

Thanks,

Anoosha