Can we use the insert-segment-to-db tool with Google Cloud Storage as deep storage?
I want to migrate segments to Google Cloud Storage and use it as deep storage, but it seems the insert-segment-to-db tool only works with HDFS, S3, and local storage.
I am getting this error when I try it with Google Cloud Storage:
Unknown provider[google] of Key[type=io.druid.segment.loading.DataSegmentFinder, annotation=[none]], known options[[local]]
at io.druid.guice.PolyBind.createChoice(PolyBind.java:70) (via modules: com.google.inject.util.Modules$OverrideModule -> com.google.inject.util.Modules$OverrideModule -> io.druid.guice.LocalDataStorageDruidModule)
while locating io.druid.segment.loading.DataSegmentFinder
1 error
at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:1028)
at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1054)
at io.druid.cli.InsertSegment.run(InsertSegment.java:102)
at io.druid.cli.Main.main(Main.java:116)
The GCS extension does provide a DataSegmentFinder implementation, so that should work.
Unknown provider[google] of Key[type=io.druid.segment.loading.DataSegmentFinder, annotation=[none]], known options[[local]]
From this, it looks like the GCS extension is not getting loaded when the insert-segment-to-db tool runs. I would check that you’ve defined the extension in your extensions load list, and that the Druid config directories are on the classpath when you run the tool.
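For reference, a minimal sketch of what that configuration might look like. The file path, bucket name, and prefix below are assumptions; adjust them to your deployment:

```shell
# conf/druid/_common/common.runtime.properties (path is an assumption)
# The load list must include the GCS extension by its directory name:
#   druid.extensions.loadList=["druid-google-extensions"]
#   druid.storage.type=google
#   druid.google.bucket=my-druid-bucket      # assumed bucket name
#   druid.google.prefix=druid/segments       # assumed prefix

# When running the tool, put that config directory on the classpath,
# e.g. (connection URI and working dir are placeholders):
java -classpath "conf/druid/_common:lib/*" io.druid.cli.Main \
  tools insert-segment-to-db \
  --workingDir gs://my-druid-bucket/druid/segments \
  --updateDescriptor true
```

If the config directory is not on the classpath, the tool falls back to defaults, where only the local provider is known, which matches the error above.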
Hm, I wonder if the extensions directory is missing files. Since GCS is a contrib extension, it isn’t packaged by default; maybe you need to run pull-deps to grab the extension:
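A sketch of the pull-deps invocation. The version in the coordinate is an assumption; match it to the Druid version you are running:

```shell
# Download the contrib GCS extension into the local extensions directory.
# The version (0.12.1 here) is an assumption -- use your Druid version.
java -classpath "lib/*" io.druid.cli.Main tools pull-deps \
  --no-default-hadoop \
  -c "io.druid.extensions.contrib:druid-google-extensions:0.12.1"
```

After this completes, you should see an extensions/druid-google-extensions directory containing the extension jars, and the load list entry above should resolve.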