Documents for HadoopConverterTask

I tried to upgrade to Druid 0.8 and convert old segments, but I can't find any docs for HadoopConverterTask. I tried issuing a task like this, but it failed:

```
curl -X POST 'http://:/druid/indexer/v1/task' -H 'content-type: application/json' -d '
{
  "type": "hadoop_convert_segment",
  "dataSource": "test_source",
  "force": true,
  "interval": "2015-03-01T00:00:00+0800/2015-05-12T16:00:00+0800"
}'
```

```
Error 500

HTTP ERROR: 500

Problem accessing /druid/indexer/v1/task. Reason:

    javax.servlet.ServletException: com.fasterxml.jackson.databind.JsonMappingException: Instantiation of [simple type, class io.druid.indexing.common.task.HadoopConverterTask] value failed: distributedSuccessCache

Powered by Jetty://
```

I checked the code in HadoopConverterTask.java and found these lines:

```java
this.distributedSuccessCache = Preconditions.checkNotNull(distributedSuccessCache, "distributedSuccessCache");
this.segmentOutputPath = Preconditions.checkNotNull(segmentOutputPath, "segmentOutputPath");
```

This confuses me, as I thought those properties would be generated automatically if omitted.
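Judging from the checkNotNull calls above, the task spec apparently needs distributedSuccessCache and segmentOutputPath supplied explicitly. A sketch of a fuller spec built from those field names (the two extra paths are hypothetical placeholders, not values from any documentation):

```json
{
  "type": "hadoop_convert_segment",
  "dataSource": "test_source",
  "force": true,
  "interval": "2015-03-01T00:00:00+0800/2015-05-12T16:00:00+0800",
  "distributedSuccessCache": "hdfs://namenode:8020/tmp/druid/success-cache",
  "segmentOutputPath": "hdfs://namenode:8020/druid/segments"
}
```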

Hi Wei, the documentation for this task seems to be missing. That is our fault. I've contacted the original author to add documentation and respond to this thread.

Thanks FJ!

BTW, I would like to know: will the historical nodes reload those upgraded segments after the convert task finishes? And can you explain some details about how it works (maybe the task puts some message in ZK to notify the coordinator, as I imagine)?
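In the meantime, one way to check whether new-version segments are being served is to ask the coordinator for the datasource's segment list. This assumes the standard coordinator segments endpoint; the host and port here are hypothetical placeholders:

```shell
# List served segments for the datasource, with full metadata
# (coordinator-host:8081 is a placeholder, not a value from this thread)
curl 'http://coordinator-host:8081/druid/coordinator/v1/datasources/test_source/segments?full'
```

Comparing the segment version strings before and after the convert task should show whether the upgraded segments were picked up.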

Will add docs