Hi - I’m trying to submit a Hadoop index task to the Overlord and get it to run on a remote Hadoop cluster. Currently I’m doing this all locally in Docker containers as a PoC, so all of the Druid nodes run in separate Docker containers, and Hadoop (HDFS and YARN) runs in its own Docker container.
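The layout is roughly the following (a trimmed, illustrative compose sketch; the images, versions and service names are placeholders rather than my exact file):

```
# docker-compose.yml (illustrative sketch only; images/versions are placeholders)
version: "3"
services:
  zookeeper:
    image: zookeeper:3.5
  metadata:
    image: postgres:13
  coordinator:
    image: apache/druid:0.22.1
    command: [ "coordinator" ]
  overlord:
    image: apache/druid:0.22.1
    command: [ "overlord" ]
  middlemanager:
    image: apache/druid:0.22.1
    command: [ "middleManager" ]
  historical:
    image: apache/druid:0.22.1
    command: [ "historical" ]
  broker:
    image: apache/druid:0.22.1
    command: [ "broker" ]
  hadoop:
    image: hadoop-pseudo-distributed   # single container running HDFS + YARN
```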
I can submit the task to the Overlord, and it seems to successfully read the data from HDFS, index it, and create a segment.
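For context, I POST the task to the Overlord’s `/druid/indexer/v1/task` endpoint, and the spec looks roughly like the one below (trimmed, with illustrative hostnames and paths rather than my exact values):

```
{
  "type": "index_hadoop",
  "spec": {
    "dataSchema": {
      "dataSource": "poc_datasource",
      "parser": { "type": "hadoopyString", ... },
      "granularitySpec": { ... }
    },
    "ioConfig": {
      "type": "hadoop",
      "inputSpec": {
        "type": "static",
        "paths": "hdfs://hadoop-master:9000/ingest/events.json"
      }
    },
    "tuningConfig": { "type": "hadoop" }
  },
  "hadoopDependencyCoordinates": ["org.apache.hadoop:hadoop-client:2.8.5"]
}
```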
However, there are two problems (the relevant config is sketched below):
1. The job is not run on the remote Hadoop cluster; it uses LocalJobRunner instead.
2. The job is unable to write the segment to HDFS: a “Pathname … is not a valid DFS filename” exception is thrown.
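For reference, the relevant bits of my Druid configuration look roughly like this (again, hostnames and paths are illustrative placeholders, not my exact values):

```
# common.runtime.properties (trimmed; values are illustrative)
druid.extensions.loadList=["druid-hdfs-storage"]

# Deep storage on HDFS
druid.storage.type=hdfs
druid.storage.storageDirectory=hdfs://hadoop-master:9000/druid/segments

# Working path used by the Hadoop indexing job
druid.indexer.task.hadoopWorkingPath=/tmp/druid-indexing
```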
I must not be setting things up properly, but I can’t find anything else to try. I would hugely appreciate any pointers in the right direction.