Can we use the insert-segment-to-db tool with Google Cloud Storage as deep storage?

Can we use the insert-segment-to-db tool with Google Cloud Storage as deep storage?
I want to migrate segments to Google Cloud Storage and use it as my deep storage, but it seems the insert-segment-to-db tool only works with HDFS, S3, and local storage.

I am getting this error when I try it with Google Cloud Storage:

Unknown provider[google] of Key[type=io.druid.segment.loading.DataSegmentFinder, annotation=[none]], known options[[local]]
  at io.druid.guice.PolyBind.createChoice(PolyBind.java:70) (via modules: com.google.inject.util.Modules$OverrideModule -> com.google.inject.util.Modules$OverrideModule -> io.druid.guice.LocalDataStorageDruidModule)
  while locating io.druid.segment.loading.DataSegmentFinder

1 error
        at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:1028)
        at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1054)
        at io.druid.cli.InsertSegment.run(InsertSegment.java:102)
        at io.druid.cli.Main.main(Main.java:116)

The GCS extension does provide a DataSegmentFinder implementation, so that should work.

Unknown provider[google] of Key[type=io.druid.segment.loading.DataSegmentFinder, annotation=[none]], known options[[local]]

From this, it looks like the GCS extension is not getting loaded when the insert-segment-to-db tool runs. I would check that you've defined the extension in your extension load list, and that the Druid config directories are on the classpath when you run the tool.
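For reference, a minimal sketch of those two pieces, assuming the standard tarball layout (the paths, bucket, and working directory below are illustrative, adjust them to your setup):

# conf/druid/_common/common.runtime.properties (alongside your metadata storage settings)
druid.extensions.loadList=["druid-google-extensions", "mysql-metadata-storage"]
druid.storage.type=google
druid.google.bucket=your-bucket
druid.google.prefix=druid/segments

# run the tool with the config directory on the classpath
java -cp "conf/druid/_common:lib/*" io.druid.cli.Main tools \
  insert-segment-to-db --workingDir "druid/segments/wikipedia" --updateDescriptor true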

Thanks,

Jon

Hi Jonathan Wei,
I am adding the GCS extension when running the job; here is the config:

DRUIDVERSION="0.12.3" # Druid version
DRUIDBASEDIR="/home/username/druid-${DRUIDVERSION}"
HADOOPVERSION="2.9.0" # Match this with the version of Hadoop on DataProc

cd ${DRUIDBASEDIR} && java \
  -Ddruid.metadata.storage.type=mysql \
  -Ddruid.metadata.storage.connector.connectURI=jdbc:mysql://ip-adress:3306/druid2 \
  -Ddruid.metadata.storage.connector.user=druid2 \
  -Ddruid.metadata.storage.connector.password=passwd \
  -Ddruid.extensions.loadList=["mysql-metadata-storage","druid-google-extensions"] \
  -Ddruid.storage.type=google \
  -Ddruid.google.bucket=druid-migration \
  -Ddruid.google.prefix=druid/segments/ \
  -cp "lib/*" io.druid.cli.Main tools insert-segment-to-db --workingDir "druid/segments/wikipedia" --updateDescriptor true

Please let me know if I need to add anything else.

Hm, I wonder if the extension directory is missing some files. Since GCS is a contrib extension, it isn't packaged by default, so you may need to run pull-deps to grab it:

http://druid.io/docs/latest/operations/pull-deps.html

e.g.,

java -cp "lib/*" -Ddruid.extensions.directory="extensions-tmp" -Ddruid.extensions.hadoopDependenciesDir="hadoop-dependencies-tmp" io.druid.cli.Main tools pull-deps -c io.druid.extensions.contrib:druid-google-extensions:0.12.3
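After that, pointing the tool at the pulled directory with -Ddruid.extensions.directory should let it find the extension. Roughly, building on your earlier command (a sketch only, keep your metadata storage and GCS flags exactly as you had them):

java -cp "lib/*" \
  -Ddruid.extensions.directory="extensions-tmp" \
  -Ddruid.extensions.loadList=["mysql-metadata-storage","druid-google-extensions"] \
  -Ddruid.metadata.storage.type=mysql \
  -Ddruid.metadata.storage.connector.connectURI=jdbc:mysql://ip-adress:3306/druid2 \
  -Ddruid.metadata.storage.connector.user=druid2 \
  -Ddruid.metadata.storage.connector.password=passwd \
  -Ddruid.storage.type=google \
  -Ddruid.google.bucket=druid-migration \
  -Ddruid.google.prefix=druid/segments/ \
  io.druid.cli.Main tools insert-segment-to-db --workingDir "druid/segments/wikipedia" --updateDescriptor true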