Hadoop index task succeeded, but no datasources found

Hi
I'm new to Druid. After doing a batch load from HDFS following the quickstart guide, queries return an empty result.

My steps:

  1. upload wikiticker-2015-09-12-sampled.json to hdfs, path: /tmp/wikiticker-2015-09-12-sampled.json

  2. modify wikiticker-index.json, change paths to /tmp/wikiticker-2015-09-12-sampled.json

  3. submit the Hadoop index task: curl -X 'POST' -H 'Content-Type:application/json' -d @wikiticker-index.json localhost:8090/druid/indexer/v1/task

  4. the log shows that the index job succeeded:

2018-01-26T15:14:48,058 INFO [Curator-PathChildrenCache-1] io.druid.indexing.overlord.TaskQueue - Received SUCCESS status for task: index_hadoop_wikiticker_2018-01-26T07:13:18.529Z

2018-01-26T15:14:48,063 INFO [Curator-PathChildrenCache-1] io.druid.indexing.overlord.TaskLockbox - TaskLock is now empty: TaskLock{groupId=index_hadoop_wikiticker_2018-01-26T07:13:18.529Z, dataSource=wikiticker, interval=2015-09-12T00:00:00.000Z/2015-09-13T00:00:00.000Z, version=2018-01-26T07:13:18.534Z}

2018-01-26T15:14:48,072 INFO [Curator-PathChildrenCache-1] io.druid.indexing.overlord.TaskRunnerUtils - Task [index_hadoop_wikiticker_2018-01-26T07:13:18.529Z] status changed to [SUCCESS].

  5. querying with wikiticker-top-pages.json returns nothing

  6. checking the broker segment metadata also returns empty:

{
  "queryType": "segmentMetadata",
  "dataSource": "wikiticker",
  "intervals": ["2015-09-12T00:00:00.000Z/2015-09-13T00:00:00.000Z"]
}
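As a side note, one way to sanity-check this step is to validate the query JSON locally before POSTing it (curly "smart" quotes pasted from a web page will silently break it), then send it to the broker. This is only a sketch assuming the default single-machine ports (broker 8082, coordinator 8081); adjust for your deployment:

```shell
# Hypothetical diagnostic sketch; ports below are the Druid defaults, not
# values confirmed by the original poster.
QUERY='{"queryType":"segmentMetadata","dataSource":"wikiticker","intervals":["2015-09-12T00:00:00.000Z/2015-09-13T00:00:00.000Z"]}'

# Validate the JSON locally first -- curly quotes would make this step fail:
echo "$QUERY" | python3 -m json.tool > /dev/null && echo "query JSON is valid"

# Then POST it to the broker (needs a running cluster, so commented out here):
# curl -s -X POST -H 'Content-Type: application/json' -d "$QUERY" http://localhost:8082/druid/v2/

# The coordinator can also list which datasources actually have loaded segments:
# curl -s http://localhost:8081/druid/coordinator/v1/metadata/datasources
```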

However, following the quickstart streaming ingestion with tranquility-distribution, the datasource "metrics" was created successfully, and queries return the expected results.

On Friday, January 26, 2018 at 4:06:10 PM UTC+8, 郭冬冬 wrote:

Hi :

Can you post your ingestion spec here? More specifically, could the path in the ingestion spec be "hdfs://:8020/tmp/wikiticker-2015-09-12-sampled.json" instead?

Thanks
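For reference, a static inputSpec inside the ioConfig of a Hadoop batch task typically looks roughly like the fragment below; the namenode host and port here are placeholders I've made up, not values from the original spec:

```json
"ioConfig" : {
  "type" : "hadoop",
  "inputSpec" : {
    "type" : "static",
    "paths" : "hdfs://namenode.example.com:8020/tmp/wikiticker-2015-09-12-sampled.json"
  }
}
```

If the paths value uses a bare local-style path, the task may read from the wrong filesystem depending on the Hadoop configuration on the middle manager, which is why the fully qualified hdfs:// URI is worth checking.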