incremental persist failed

Hi all,

I hit an error when using the indexing service to submit an index_realtime task.
Here are my steps:

1. The task JSON file:
{
    "type":"index_realtime",
    "spec":{
        "dataSchema":{
            "dataSource":"ts_sf_capture_his",
            "parser":{
                "type":"string",
                "parseSpec":{
                    "format":"json",
                    "timestampSpec":{
                        "column":"time",
                        "format":"millis"
                    },
                    "dimensionsSpec":{
                        "dimensions":[
                            "frameId",
                            "taskHandle",
                            "imgWidth",
                            "imgHeight",
                            "trackId",
                            "pitch",
                            "qualityScore",
                            "yaw",
                            "imgUrl",
                            "taskSerial",
                            "taskName",
                            "taskFilter",
                            "videoSerial",
                            "videoName",
                            "targetsSerial",
                            "targetsName"
                        ],
                        "dimensionExclusions":[],
                        "spatialDimensions":[]
                    }
                }
            },
            "metricsSpec":[
                {
                    "name":"capture_counts",
                    "type":"count"
                }
            ],
            "granularitySpec":{
                "type":"uniform",
                "segmentGranularity":"HOUR",
                "queryGranularity":"second"
            }
        },
        "ioConfig":{
            "type":"realtime",
            "firehose":{
                "type":"kafka-0.8",
                "consumerProps":{
                    "zookeeper.connect":"XXXXXX:2181",
                    "zookeeper.connection.timeout.ms":"30000",
                    "zookeeper.session.timeout.ms":"30000",
                    "zookeeper.sync.time.ms":"5000",
                    "group.id":"ts_sf_capture_his",
                    "fetch.message.max.bytes":"1048586",
                    "auto.offset.reset":"largest",
                    "auto.commit.enable":"false"
                },
                "feed":"st.sf_history"
            }
        },
        "tuningConfig":{
            "type":"realtime",
            "maxRowsInMemory":500000,
            "intermediatePersistPeriod":"PT10m",
            "windowPeriod":"PT10m",
            "basePersistDirectory":"/home/wangjing/logs/localpersist/"
        }
    }
}
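One thing worth checking before submitting: if a spec like the one above is copied through a rich-text editor, curly quotes and similar damage make it invalid JSON and the overlord will reject it. A minimal sanity check, assuming a helper like this (the file path is the one from my setup, nothing Druid-specific):

```python
import json

def validate_spec(text):
    # json.loads rejects curly quotes, trailing commas, and other
    # copy-paste damage before the overlord ever sees the spec.
    spec = json.loads(text)
    # An index_realtime task should at least carry these sections.
    assert spec["type"] == "index_realtime"
    for key in ("dataSchema", "ioConfig", "tuningConfig"):
        assert key in spec["spec"], "missing " + key
    return spec
```

Usage: `validate_spec(open("jsondir/sf_capture_his.json").read())` before running the curl command.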

2. Submit the JSON file:

curl -X 'POST' -H 'Content-Type:application/json' -d @jsondir/sf_capture_his.json http://XXXXX:8090/druid/indexer/v1/task

But the task fails. When the program reaches RealtimePlumber.persistHydrant, this alert is emitted:
log.makeAlert("dataSource[%s] -- incremental persist failed", schema.getDataSource())
   .addData("interval", interval)
   .addData("count", indexToPersist.getCount())
   .emit();

While remote debugging I found the underlying exception is java.io.IOException: No such file or directory, and the basePersistDirectory actually in use is not the one I configured in the JSON file. Instead it is a different directory (var/druid/task/index_realtimeXXXX/work/persist), which comes from my MiddleManager configuration (druid.indexer.task.baseTaskDir=var/druid/task).
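This matches how the indexing service runs realtime tasks: the peon swaps the basePersistDirectory from the spec for its own task working directory under druid.indexer.task.baseTaskDir, so the path in tuningConfig is effectively ignored. A rough sketch of the path it ends up using (my reading of the observed behavior, not Druid's actual code; the task id is the redacted one from my logs):

```python
import os

def effective_persist_dir(base_task_dir, task_id):
    # The peon persists under <baseTaskDir>/<taskId>/work/persist,
    # regardless of tuningConfig.basePersistDirectory in the spec.
    return os.path.join(base_task_dir, task_id, "work", "persist")

# With druid.indexer.task.baseTaskDir=var/druid/task on my MiddleManager:
print(effective_persist_dir("var/druid/task", "index_realtimeXXXX"))
# -> var/druid/task/index_realtimeXXXX/work/persist
```

So the IOException is about that task-local directory, not about /home/wangjing/logs/localpersist/.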


I feel very confused!

I know there are other ways to submit, e.g. Tranquility and Kafka ingestion, but I want to understand the cause of this error. Thanks!

I got it, thanks!

On Monday, September 19, 2016 at 1:13:46 PM UTC+8, Jerome Liu wrote: