[druid-user] sharing deep storage

Hi,

I have a Druid cluster that uses S3 as deep storage, with hundreds of datasources already created.
If I want to start a second cluster, can I point it at the same S3 deep storage and configure it to load those segments into its segment cache? I'm hoping to avoid re-ingesting the hundreds of datasources.

Thanks,

Hey DJ,
You didn’t say what the purpose of sharing the deep storage is.

Assuming the second cluster will only read from the deep storage and the metadata store, you should be fine (just make sure you give that second cluster read-only permissions on S3 and RDS).
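
For what it's worth, here is a minimal sketch of what the second cluster's common.runtime.properties could look like. The bucket, base key, extension list, and connection details are placeholders for illustration; adjust them to match whatever your first cluster actually uses.

    # Load the S3 and metadata-store extensions (adjust names to your install)
    druid.extensions.loadList=["druid-s3-extensions", "mysql-metadata-storage"]

    # Point at the SAME deep storage the first cluster writes to (placeholder bucket/key)
    druid.storage.type=s3
    druid.storage.bucket=my-druid-bucket
    druid.storage.baseKey=druid/segments

    # Point at the SAME metadata store, ideally via a read-only DB user (placeholder values)
    druid.metadata.storage.type=mysql
    druid.metadata.storage.connector.connectURI=jdbc:mysql://prod-metadata.example.com:3306/druid
    druid.metadata.storage.connector.user=druid_readonly
    druid.metadata.storage.connector.password=<read-only-password>

With the shared metadata store, the second cluster's Coordinator will see the existing segment records, and its Historicals will pull those segments from S3 into their local segment cache (druid.segmentCache.locations), so nothing has to be re-ingested.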

BTW, we used a method similar to the one described above to create a Dev cluster that is a replica (in terms of data) of the Prod cluster, but in "read-only" mode (see this slide deck from Virtual Druid Summit, slide 55).

Good luck :)
Itai