Exception while using Hadoop-based batch ingestion


I’m trying to index data in Parquet format stored on HDFS in a Cloudera Hadoop cluster into a Druid data source.

I’m following the spec file and the indexing command from this link: http://druid.io/docs/latest/development/extensions-contrib/parquet.html
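For reference, the spec on that page is an `index_hadoop` task whose top-level `"spec"` field wraps the `dataSchema`, `ioConfig`, and `tuningConfig`. A trimmed sketch of that shape is below — the datasource name, timestamp/dimension columns, interval, and HDFS path are placeholders, not my actual values:

```json
{
  "type": "index_hadoop",
  "spec": {
    "dataSchema": {
      "dataSource": "example_datasource",
      "parser": {
        "type": "parquet",
        "parseSpec": {
          "format": "timeAndDims",
          "timestampSpec": { "column": "ts", "format": "auto" },
          "dimensionsSpec": { "dimensions": ["dim1", "dim2"] }
        }
      },
      "metricsSpec": [ { "type": "count", "name": "count" } ],
      "granularitySpec": {
        "type": "uniform",
        "segmentGranularity": "DAY",
        "queryGranularity": "NONE",
        "intervals": ["2017-01-01/2017-01-02"]
      }
    },
    "ioConfig": {
      "type": "hadoop",
      "inputSpec": {
        "type": "static",
        "inputFormat": "io.druid.data.input.parquet.DruidParquetInputFormat",
        "paths": "hdfs://namenode:8020/path/to/file.parquet"
      }
    },
    "tuningConfig": { "type": "hadoop" }
  }
}
```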

However, I’m getting the exception below and am unable to proceed. Can somebody help me with this issue?

Caused by: com.fasterxml.jackson.databind.JsonMappingException: Instantiation of [simple type, class io.druid.indexer.HadoopDruidIndexerConfig] value failed: null (through reference chain: io.druid.indexer.HadoopDruidIndexerConfig["spec"])

Thanks in advance!


Here’s the complete exception stack trace.