Druid simple file ingest show status successful but fails loading

I am fairly new to the big data world and am having a very difficult time loading a very simple file. I have attached the spec file, the data file, and the error log. I am running on the Hortonworks 2.6.5 sandbox VM and cannot get past this file ingestion. Any help would be appreciated.

DruidErrorLog.txt (66.9 KB)

v1res_dq_spec.json.old3 (1.12 KB)

1rowOfData.json (54 Bytes)

Hi -

This part of the task log shows that it didn’t find any data to publish:

2018-08-24T06:14:35,624 INFO [publish-0] io.druid.segment.realtime.appenderator.AppenderatorDriver - Nothing to publish, skipping publish step.

2018-08-24T06:14:35,625 INFO [task-runner-0-priority-0] io.druid.indexing.common.task.IndexTask - Published segments

This is likely because there is no timestamp in the data file (Druid partitions everything based on timestamp). Try adding a timestamp that matches what’s in your ingestion spec:

"timestampSpec" : { "column" : "timestamp", "format" : "iso" },

"intervals" : [ "2018-01-01/2018-12-31" ]
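
With that spec, each row in the data file would need a "timestamp" field whose ISO-8601 value falls inside the "intervals" range. A sketch of one such row (the "value" field is just a placeholder for whatever dimensions and metrics your file actually contains):

```json
{"timestamp": "2018-08-24T00:00:00Z", "value": 1}
```

Rows whose timestamp is missing, unparseable, or outside the intervals are dropped, which leads to the "Nothing to publish" message above.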

I added a timestamp, to no avail. I am attaching the most recent files with the error log. These are local filesystem files, created and run under my login; I am not sure if that is an issue or not. Thank you for your help!

start_v1res_dq.sh (109 Bytes)

DruidErrorLog.txt (66.9 KB)

v1res_dq.json3.json (235 Bytes)

v1res_dq_spec.json.json (1.24 KB)


It looks like the JSON is not valid in v1res_dq.json3. You can try adding reportParseExceptions: true to your tuningConfig, which will fail the task and tell you where the parse error is.
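
For an index task the flag goes inside the tuningConfig section of the ingestion spec, roughly like this (a sketch; reportParseExceptions was the flag name in Druid releases of that era, and the rest of your tuningConfig stays as it is):

```json
"tuningConfig" : {
  "type" : "index",
  "reportParseExceptions" : true
}
```

With the flag set, the task log will include the offending line instead of silently skipping it.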