Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found

Hello,

I am trying to reindex segments using the Hadoop indexing service, and I am getting the following errors on the Overlord.

```
2017-07-10T08:22:18,227 DEBUG [task-runner-0-priority-0] org.apache.hadoop.service.AbstractService - noteFailure java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found
2017-07-10T08:22:18,227 INFO [task-runner-0-priority-0] org.apache.hadoop.service.AbstractService - Service org.apache.hadoop.yarn.client.api.impl.YarnClientImpl failed in state STARTED; cause: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found
java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2227) ~[hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.yarn.client.RMProxy.createRMFailoverProxyProvider(RMProxy.java:161) ~[hadoop-yarn-common-2.7.3.jar:?]
	at org.apache.hadoop.yarn.client.RMProxy.createRMProxy(RMProxy.java:94) ~[hadoop-yarn-common-2.7.3.jar:?]
	at org.apache.hadoop.yarn.client.ClientRMProxy.createRMProxy(ClientRMProxy.java:72) ~[hadoop-yarn-common-2.7.3.jar:?]
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceStart(YarnClientImpl.java:187) ~[hadoop-yarn-client-2.7.3.jar:?]
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193) [hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.mapred.ResourceMgrDelegate.serviceStart(ResourceMgrDelegate.java:108) [hadoop-mapreduce-client-jobclient-2.7.3.jar:?]
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193) [hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.mapred.ResourceMgrDelegate.<init>(ResourceMgrDelegate.java:97) [hadoop-mapreduce-client-jobclient-2.7.3.jar:?]
	at org.apache.hadoop.mapred.YARNRunner.<init>(YARNRunner.java:112) [hadoop-mapreduce-client-jobclient-2.7.3.jar:?]
	at org.apache.hadoop.mapred.YarnClientProtocolProvider.create(YarnClientProtocolProvider.java:34) [hadoop-mapreduce-client-jobclient-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:95) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1260) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1256) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_131]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_131]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698) [hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Job.connect(Job.java:1256) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1284) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at io.druid.indexer.IndexGeneratorJob.run(IndexGeneratorJob.java:203) [druid-indexing-hadoop-0.10.0.jar:0.10.0]
	at io.druid.indexer.JobHelper.runJobs(JobHelper.java:349) [druid-indexing-hadoop-0.10.0.jar:0.10.0]
	at io.druid.indexer.HadoopDruidIndexerJob.run(HadoopDruidIndexerJob.java:95) [druid-indexing-hadoop-0.10.0.jar:0.10.0]
	at io.druid.indexing.common.task.HadoopIndexTask$HadoopIndexGeneratorInnerProcessing.runTask(HadoopIndexTask.java:276) [druid-indexing-service-0.10.0.jar:0.10.0]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_131]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_131]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_131]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_131]
	at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:208) [druid-indexing-service-0.10.0.jar:0.10.0]
	at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:223) [druid-indexing-service-0.10.0.jar:0.10.0]
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436) [druid-indexing-service-0.10.0.jar:0.10.0]
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408) [druid-indexing-service-0.10.0.jar:0.10.0]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_131]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_131]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_131]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_131]
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195) ~[hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2219) ~[hadoop-common-2.7.3.jar:?]
	... 36 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101) ~[hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193) ~[hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2219) ~[hadoop-common-2.7.3.jar:?]
	... 36 more
2017-07-10T08:22:18,228 DEBUG [task-runner-0-priority-0] org.apache.hadoop.service.AbstractService - Service: org.apache.hadoop.yarn.client.api.impl.YarnClientImpl entered state STOPPED
2017-07-10T08:22:18,228 DEBUG [task-runner-0-priority-0] org.apache.hadoop.service.AbstractService - Service: org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl entered state STOPPED
2017-07-10T08:22:18,228 DEBUG [task-runner-0-priority-0] org.apache.hadoop.service.AbstractService - noteFailure java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found
2017-07-10T08:22:18,228 INFO [task-runner-0-priority-0] org.apache.hadoop.service.AbstractService - Service org.apache.hadoop.mapred.ResourceMgrDelegate failed in state STARTED; cause: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found
java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2227) ~[hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.yarn.client.RMProxy.createRMFailoverProxyProvider(RMProxy.java:161) ~[hadoop-yarn-common-2.7.3.jar:?]
	at org.apache.hadoop.yarn.client.RMProxy.createRMProxy(RMProxy.java:94) ~[hadoop-yarn-common-2.7.3.jar:?]
	at org.apache.hadoop.yarn.client.ClientRMProxy.createRMProxy(ClientRMProxy.java:72) ~[hadoop-yarn-common-2.7.3.jar:?]
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceStart(YarnClientImpl.java:187) ~[hadoop-yarn-client-2.7.3.jar:?]
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193) [hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.mapred.ResourceMgrDelegate.serviceStart(ResourceMgrDelegate.java:108) ~[hadoop-mapreduce-client-jobclient-2.7.3.jar:?]
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193) [hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.mapred.ResourceMgrDelegate.<init>(ResourceMgrDelegate.java:97) [hadoop-mapreduce-client-jobclient-2.7.3.jar:?]
	at org.apache.hadoop.mapred.YARNRunner.<init>(YARNRunner.java:112) [hadoop-mapreduce-client-jobclient-2.7.3.jar:?]
	at org.apache.hadoop.mapred.YarnClientProtocolProvider.create(YarnClientProtocolProvider.java:34) [hadoop-mapreduce-client-jobclient-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:95) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1260) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1256) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_131]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_131]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698) [hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Job.connect(Job.java:1256) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1284) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at io.druid.indexer.IndexGeneratorJob.run(IndexGeneratorJob.java:203) [druid-indexing-hadoop-0.10.0.jar:0.10.0]
	at io.druid.indexer.JobHelper.runJobs(JobHelper.java:349) [druid-indexing-hadoop-0.10.0.jar:0.10.0]
	at io.druid.indexer.HadoopDruidIndexerJob.run(HadoopDruidIndexerJob.java:95) [druid-indexing-hadoop-0.10.0.jar:0.10.0]
	at io.druid.indexing.common.task.HadoopIndexTask$HadoopIndexGeneratorInnerProcessing.runTask(HadoopIndexTask.java:276) [druid-indexing-service-0.10.0.jar:0.10.0]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_131]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_131]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_131]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_131]
	at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:208) [druid-indexing-service-0.10.0.jar:0.10.0]
	at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:223) [druid-indexing-service-0.10.0.jar:0.10.0]
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436) [druid-indexing-service-0.10.0.jar:0.10.0]
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408) [druid-indexing-service-0.10.0.jar:0.10.0]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_131]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_131]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_131]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_131]
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195) ~[hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2219) ~[hadoop-common-2.7.3.jar:?]
	... 36 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101) ~[hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193) ~[hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2219) ~[hadoop-common-2.7.3.jar:?]
	... 36 more
2017-07-10T08:22:18,229 DEBUG [task-runner-0-priority-0] org.apache.hadoop.service.AbstractService - Service: org.apache.hadoop.mapred.ResourceMgrDelegate entered state STOPPED
2017-07-10T08:22:18,229 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Cluster - Failed to use org.apache.hadoop.mapred.YarnClientProtocolProvider due to error:
java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2227) ~[hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.yarn.client.RMProxy.createRMFailoverProxyProvider(RMProxy.java:161) ~[hadoop-yarn-common-2.7.3.jar:?]
	at org.apache.hadoop.yarn.client.RMProxy.createRMProxy(RMProxy.java:94) ~[hadoop-yarn-common-2.7.3.jar:?]
	at org.apache.hadoop.yarn.client.ClientRMProxy.createRMProxy(ClientRMProxy.java:72) ~[hadoop-yarn-common-2.7.3.jar:?]
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceStart(YarnClientImpl.java:187) ~[hadoop-yarn-client-2.7.3.jar:?]
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193) ~[hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.mapred.ResourceMgrDelegate.serviceStart(ResourceMgrDelegate.java:108) ~[hadoop-mapreduce-client-jobclient-2.7.3.jar:?]
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193) ~[hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.mapred.ResourceMgrDelegate.<init>(ResourceMgrDelegate.java:97) ~[hadoop-mapreduce-client-jobclient-2.7.3.jar:?]
	at org.apache.hadoop.mapred.YARNRunner.<init>(YARNRunner.java:112) ~[hadoop-mapreduce-client-jobclient-2.7.3.jar:?]
	at org.apache.hadoop.mapred.YarnClientProtocolProvider.create(YarnClientProtocolProvider.java:34) ~[hadoop-mapreduce-client-jobclient-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:95) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1260) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1256) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_131]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_131]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698) [hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Job.connect(Job.java:1256) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1284) [hadoop-mapreduce-client-core-2.7.3.jar:?]
	at io.druid.indexer.IndexGeneratorJob.run(IndexGeneratorJob.java:203) [druid-indexing-hadoop-0.10.0.jar:0.10.0]
	at io.druid.indexer.JobHelper.runJobs(JobHelper.java:349) [druid-indexing-hadoop-0.10.0.jar:0.10.0]
	at io.druid.indexer.HadoopDruidIndexerJob.run(HadoopDruidIndexerJob.java:95) [druid-indexing-hadoop-0.10.0.jar:0.10.0]
	at io.druid.indexing.common.task.HadoopIndexTask$HadoopIndexGeneratorInnerProcessing.runTask(HadoopIndexTask.java:276) [druid-indexing-service-0.10.0.jar:0.10.0]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_131]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_131]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_131]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_131]
	at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:208) [druid-indexing-service-0.10.0.jar:0.10.0]
	at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:223) [druid-indexing-service-0.10.0.jar:0.10.0]
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436) [druid-indexing-service-0.10.0.jar:0.10.0]
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408) [druid-indexing-service-0.10.0.jar:0.10.0]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_131]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_131]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_131]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_131]
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195) ~[hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2219) ~[hadoop-common-2.7.3.jar:?]
	... 36 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101) ~[hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193) ~[hadoop-common-2.7.3.jar:?]
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2219) ~[hadoop-common-2.7.3.jar:?]
	... 36 more
2017-07-10T08:22:18,230 DEBUG [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Cluster - Trying ClientProtocolProvider : org.apache.hadoop.mapred.LocalClientProtocolProvider
2017-07-10T08:22:18,230 DEBUG [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Cluster - Cannot pick org.apache.hadoop.mapred.LocalClientProtocolProvider as the ClientProtocolProvider - returned null protocol
2017-07-10T08:22:18,230 DEBUG [task-runner-0-priority-0] org.apache.hadoop.security.UserGroupInformation - PrivilegedActionException as:root (auth:SIMPLE) cause:java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
2017-07-10T08:22:18,232 ERROR [task-runner-0-priority-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Exception while running task[HadoopIndexTask{id=index_hadoop_111-AnalyticsData-v1_2017-07-10T08:22:09.831Z, type=index_hadoop, dataSource=111-AnalyticsData-v1}]
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
	at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-16.0.1.jar:?]
	at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:211) ~[druid-indexing-service-0.10.0.jar:0.10.0]
	at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:223) ~[druid-indexing-service-0.10.0.jar:0.10.0]
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436) [druid-indexing-service-0.10.0.jar:0.10.0]
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408) [druid-indexing-service-0.10.0.jar:0.10.0]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_131]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_131]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_131]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_131]
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_131]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_131]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_131]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_131]
	at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:208) ~[druid-indexing-service-0.10.0.jar:0.10.0]
	... 7 more
Caused by: java.lang.RuntimeException: java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
	at io.druid.indexer.IndexGeneratorJob.run(IndexGeneratorJob.java:215) ~[druid-indexing-hadoop-0.10.0.jar:0.10.0]
	at io.druid.indexer.JobHelper.runJobs(JobHelper.java:349) ~[druid-indexing-hadoop-0.10.0.jar:0.10.0]
	at io.druid.indexer.HadoopDruidIndexerJob.run(HadoopDruidIndexerJob.java:95) ~[druid-indexing-hadoop-0.10.0.jar:0.10.0]
	at io.druid.indexing.common.task.HadoopIndexTask$HadoopIndexGeneratorInnerProcessing.runTask(HadoopIndexTask.java:276) ~[druid-indexing-service-0.10.0.jar:0.10.0]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_131]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_131]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_131]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_131]
	at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:208) ~[druid-indexing-service-0.10.0.jar:0.10.0]
	... 7 more
Caused by: java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
	at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:120) ~[?:?]
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82) ~[?:?]
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75) ~[?:?]
	at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1260) ~[?:?]
	at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1256) ~[?:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_131]
	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_131]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698) ~[?:?]
	at org.apache.hadoop.mapreduce.Job.connect(Job.java:1256) ~[?:?]
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1284) ~[?:?]
	at io.druid.indexer.IndexGeneratorJob.run(IndexGeneratorJob.java:203) ~[druid-indexing-hadoop-0.10.0.jar:0.10.0]
	at io.druid.indexer.JobHelper.runJobs(JobHelper.java:349) ~[druid-indexing-hadoop-0.10.0.jar:0.10.0]
	at io.druid.indexer.HadoopDruidIndexerJob.run(HadoopDruidIndexerJob.java:95) ~[druid-indexing-hadoop-0.10.0.jar:0.10.0]
	at io.druid.indexing.common.task.HadoopIndexTask$HadoopIndexGeneratorInnerProcessing.runTask(HadoopIndexTask.java:276) ~[druid-indexing-service-0.10.0.jar:0.10.0]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_131]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_131]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_131]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_131]
	at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:208) ~[druid-indexing-service-0.10.0.jar:0.10.0]
	... 7 more
2017-07-10T08:22:18,242 INFO [task-runner-0-priority-0] io.druid.indexing.overlord.TaskRunnerUtils - Task [index_hadoop_111-AnalyticsData-v1_2017-07-10T08:22:09.831Z] status changed to [FAILED].
2017-07-10T08:22:18,245 INFO [task-runner-0-priority-0] io.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
  "id" : "index_hadoop_111-AnalyticsData-v1_2017-07-10T08:22:09.831Z",
  "status" : "FAILED",
  "duration" : 4265
}
```
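
From what I can tell, `RequestHedgingRMFailoverProxyProvider` is the class that Ambari-managed clusters with ResourceManager HA typically set as `yarn.client.failover-proxy-provider` in yarn-site.xml, and it does not appear to exist in the stock Apache hadoop-yarn-common 2.7.3 jar that the task loads. Would overriding that property in the task's `jobProperties`, for example falling back to the stock `ConfiguredRMFailoverProxyProvider`, be a reasonable workaround? This is only a hypothetical sketch of what I mean, not something I have verified:

```
"tuningConfig": {
  "type": "hadoop",
  "jobProperties": {
    "yarn.client.failover-proxy-provider": "org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider"
  }
}
```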

Hadoop Version - 2.7.3

Druid Version - 0.10.0

I have also tried multiple Hadoop client versions: 2.6.0, 2.7.3, 2.7.1, and 2.3.0.

Hadoop is installed through Ambari; Druid runs in Docker containers.

Config used:

  "type": "index_hadoop",
  "spec": {
    "dataSchema": {
      "dataSource": "ABC",
      "parser": {
        "type": "hadoopString",
        "parseSpec": {
          "format": "json",
          "timestampSpec": {
            "column": "timestamp",
            "format": "auto"
          },
          "flattenSpec": {
            "useFieldDiscovery": true

          },
          "dimensionsSpec": {

          }
        }
      },
      "metricsSpec": [
        {
          "type": "count",
          "name": "count"
        }
      ],
      "granularitySpec": {
        "type": "uniform",
        "segmentGranularity": "hour",
        "queryGranularity": "hour",
        "intervals" : [ "2017-07-04T06:00:00.000Z/2017-07-04T07:00:00.000Z" ]
      }
    },
    "ioConfig": {
      "type": "hadoop",
      "inputSpec": {
        "type": "static",
        "paths": "s3n://path here"
      }
    },
    "tuningConfig": {
      "type": "hadoop",
      "jobProperties":{
	      "mapreduce.job.user.classpath.first": true,
	      "mapreduce.job.classloader.system.classes": "-javax.validation.,java.,javax.,org.apache.commons.logging.,org.apache.log4j.,org.apache.hadoop.",
	      "fs.s3.awsAccessKeyId" : "<Access key>",
          "fs.s3.awsSecretAccessKey" : "<Secret Access key>",
          "fs.s3n.impl" : "org.apache.hadoop.fs.s3native.NativeS3FileSystem",
"io.compression.codecs" : "org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.BZip2Codec,org.apache.hadoop.io.compress.SnappyCodec"
      }
    }
  },
  "hadoopDependencyCoordinates": ["org.apache.hadoop:hadoop-client:2.6.0"]
}
```
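
One more thing I am unsure about: the spec pins `hadoopDependencyCoordinates` to hadoop-client 2.6.0, while the stack trace shows 2.7.3 jars on the classpath and the cluster itself is 2.7.3. Would aligning the coordinates with the cluster version, roughly as sketched below, make any difference (assuming the matching client jars are available under Druid's hadoop-dependencies directory)?

```
"hadoopDependencyCoordinates": ["org.apache.hadoop:hadoop-client:2.7.3"]
```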

How can I resolve this?