Here are the versions I'm working with:
Druid: druid-0.11.0-SNAPSHOT
Hadoop: Cloudera CDH 5.9.0, Hadoop client 2.6.0-cdh5.9.0
I have a CSV file in HDFS that I'm trying to load into Druid. Ingestion works when my deep storage is the local filesystem, but fails when I change deep storage to HDFS. I had to switch from the local filesystem to HDFS because the MapReduce job writing just 10 MB of data from HDFS to the local filesystem was taking too long.
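For reference, here is a minimal sketch of the HDFS deep-storage settings in common.runtime.properties. The directory paths are assumptions (adjust them to your cluster), and the Hadoop client XML configs (core-site.xml, hdfs-site.xml, etc.) must also be on the Druid classpath:

```properties
# Load the HDFS storage extension shipped with the Druid distribution
druid.extensions.loadList=["druid-hdfs-storage"]

# Deep storage on HDFS (path is an example; change to your cluster layout)
druid.storage.type=hdfs
druid.storage.storageDirectory=/druid/segments

# Optionally keep indexing task logs on HDFS as well
druid.indexer.logs.type=hdfs
druid.indexer.logs.directory=/druid/indexing-logs
```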
Attached are my JSON task file and the modified properties files; the other runtime property files are unchanged.
I start the Druid nodes with the basic commands given in the quickstart.
Please help me fix this issue; based on the outcome, we need to decide whether we can use Druid in production.
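One thing worth checking: HDFS ingestion failures on CDH are often caused by a Hadoop dependency mismatch between Druid's bundled Hadoop client and the cluster's version. A commonly suggested workaround (a sketch, not a confirmed fix for this exact error) is to pin the CDH Hadoop coordinates in the Hadoop index task and enable MapReduce classloader isolation via jobProperties:

```json
{
  "type": "index_hadoop",
  "hadoopDependencyCoordinates": ["org.apache.hadoop:hadoop-client:2.6.0-cdh5.9.0"],
  "spec": {
    "tuningConfig": {
      "type": "hadoop",
      "jobProperties": {
        "mapreduce.job.classloader": "true"
      }
    }
  }
}
```

This assumes the matching CDH client jars have been made available under Druid's hadoop-dependencies directory; otherwise the coordinates will fail to resolve.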
common.runtime.properties.txt (4.06 KB)
middleManager_runtime.properties.txt (962 Bytes)
hadoop_job_error_log.txt (34.2 KB)
druid_indexing_console_error_log.txt (4.16 KB)
hdfsToDruid.json.txt (2.48 KB)