Index data from deep storage


The docs say this:

Druid uses an inputSpec in the ioConfig to know where the data to be ingested is located and how to read it. For simple Hadoop batch ingestion, the `static` or `granularity` inputSpec types allow you to read data stored in deep storage.

I understand that Druid can index data from deep storage that doesn't yet exist in the cached segments.

I’ve written a task like

```
"granularitySpec": {
  "type": "uniform",
  "segmentGranularity": "DAY",
  "queryGranularity": "DAY",
  "intervals": [ ...
"ioConfig": {
  "type": "hadoop",
  "inputSpec": {
    "type": "static",
    "paths": "/home/richard/temp/user-us"
```


where `paths` points at data in deep storage, but it doesn't work.

HELP Please


By data stored in deep storage, do you mean the Druid segments which are stored in deep storage?
If that is the case, docs for re-indexing existing Druid segments can be found here -
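
For reference, re-indexing existing Druid segments uses the `dataSource` inputSpec rather than `static` paths: instead of pointing `paths` at segment files in deep storage, you name the source datasource and the intervals to read. A minimal sketch of just the ioConfig, assuming a source datasource named `user-us` and an illustrative interval (both are placeholders, not values from the original task):

```
"ioConfig": {
  "type": "hadoop",
  "inputSpec": {
    "type": "dataSource",
    "ingestionSpec": {
      "dataSource": "user-us",
      "intervals": ["2016-01-01/2016-02-01"]
    }
  }
}
```

With this inputSpec, Druid itself resolves which segments in deep storage cover the given intervals, so you never reference segment file paths directly.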