insert-segment-to-db: load metadata for segments located in Google Cloud Storage

Hi everyone,

I have a Druid 0.12 setup up and running with a GCS deep storage.

I'm now trying to use insert-segment-to-db to load some old segments located in a Google Cloud Storage bucket.

    java -cp /lib/:/opt/druid/lib/:conf/druid/_common:/opt/druid/extensions/druid-google-extensions/:/opt/druid/extensions/druid-hdfs-storage/ \
      io.druid.cli.Main tools insert-segment-to-db \
      --workingDir gs://path-to-segments


The command above successfully updates our metadata, but queries for these segments' time period still return nothing.

Querying the metadata storage, I found that:

  • the bucket info seems to be ignored
  • the loadSpec type is hdfs rather than google
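The mismatch is easy to spot once the stored payload is decoded; a minimal Python sketch, using a trimmed example payload rather than a real metadata row:

```python
import json

# Trimmed example of a segment payload as stored in the metadata table
# (illustrative values, not an actual row from our database).
payload = '''
{
  "loadSpec": {
    "type": "hdfs",
    "path": "gs://path-to-segments"
  }
}
'''

spec = json.loads(payload)["loadSpec"]
# A GCS segment should have type "google" and a "bucket" key; this one has neither.
print(spec["type"])       # hdfs
print("bucket" in spec)   # False
```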

Newly loaded segment:

    "loadSpec": {
        "type": "hdfs",
        "path": "gs://path-to-segments"


Regular segment:

    "loadSpec": {
        "type": "google",
        "bucket": "bucket",
        "path": "gs://path-to-segments"


I suspect I'm using the hdfs extension rather than the google one, but if I specify google as a Java arg I get the following error:

Exception in thread "main" Unable to provision, see the following errors:

  1. Unknown provider[google] of Key[type=io.druid.segment.loading.DataSegmentFinder, annotation=[none]], known options[[local]]

at io.druid.guice.PolyBind.createChoice( (via modules:$OverrideModule ->$OverrideModule -> io.druid.guice.LocalDataStorageDruidModule)

while locating io.druid.segment.loading.DataSegmentFinder

1 error




at io.druid.cli.Main.main(


Has anyone gone through this process already?

Thank you in advance,

It seems that druid-google-extensions doesn't provide an implementation of the DataSegmentFinder interface (which is used only by insert-segment-to-db), so that tool is not currently supported with GCS.
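Conceptually, a DataSegmentFinder just walks the deep-storage working directory for descriptor.json files and returns the segment metadata they describe. A language-agnostic sketch in Python (the file layout and function name here are illustrative assumptions, not the actual Druid code or the PR):

```python
import json
import tempfile
from pathlib import Path

def find_segments(working_dir):
    """Collect segment metadata by scanning for descriptor.json files.
    Sketch of what a DataSegmentFinder implementation does conceptually."""
    segments = []
    for descriptor in Path(working_dir).rglob("descriptor.json"):
        segments.append(json.loads(descriptor.read_text()))
    return segments

# Demo with a throwaway local directory standing in for deep storage.
with tempfile.TemporaryDirectory() as d:
    seg_dir = Path(d) / "datasource" / "2018-01-01"
    seg_dir.mkdir(parents=True)
    (seg_dir / "descriptor.json").write_text(
        json.dumps({"loadSpec": {"type": "google", "bucket": "bucket"}})
    )
    found = find_segments(d)
    print(len(found))  # 1
```

The real implementation would additionally rewrite each segment's loadSpec (type "google" plus the bucket) before inserting it into the metadata store, which is exactly the part that was coming out as hdfs above.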

I opened an issue here:



Thank you, Jonathan.

I just made a pull request to add this to the Google Cloud Storage adapter: