HadoopIndexer not working: Unable to resolve artifacts for org.apache.hadoop:hadoop-client:jar:2.3.0

Hi,

I am following the indexer tutorial to ingest batch data.
http://druid.io/docs/latest/ingestion/batch-ingestion.html

```
java -Xmx256m -Duser.timezone=UTC -Dfile.encoding=UTF-8 -classpath lib/:/usr/hdp/2.3.2.0-2950/hadoop/conf/ io.druid.cli.Main index hadoop hadoop-indexer.json
Jan 21, 2016 9:05:07 PM org.hibernate.validator.internal.util.Version
INFO: HV000001: Hibernate Validator 5.1.3.Final
2016-01-21T21:05:08,059 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, coordinates=, defaultVersion='0.8.2', localRepository='/root/.m2/repository', remoteRepositories=[https://repo1.maven.org/maven2/, https://metamx.artifactoryonline.com/metamx/pub-libs-releases-local]}]
```

The local repository it is picking up is not the one I specified in my config file, which is the extension-repo folder. When I run the broker or any other service, it picks up the local repo correctly. Is there somewhere else I need to change the configs?
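For reference, in Druid 0.8.x the local extension repository is set in the common runtime properties file, and that file is only read if the folder containing it is on the process classpath. A minimal sketch (the `extensions-repo` value below is an assumption based on the folder name mentioned above; adjust it to your actual path):

```properties
# config/_common/common.runtime.properties
# Only takes effect when config/_common is on the classpath.
druid.extensions.localRepository=extensions-repo
```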

The problem I am facing is that running the hadoop index command throws the exception below:
```
2016-01-21T21:06:30,136 ERROR [main] io.druid.initialization.Initialization - Unable to resolve artifacts for [org.apache.hadoop:hadoop-client:jar:2.3.0 (runtime) -> < [ (https://repo1.maven.org/maven2/, releases+snapshots), (https://metamx.artifactoryonline.com/metamx/pub-libs-releases-local, releases+snapshots)]].
java.lang.NullPointerException
	at org.eclipse.aether.internal.impl.DefaultRepositorySystem.resolveDependencies(DefaultRepositorySystem.java:361) ~[aether-impl-0.9.0.M2.jar:?]
	at io.tesla.aether.internal.DefaultTeslaAether.resolveArtifacts(DefaultTeslaAether.java:289) ~[tesla-aether-0.0.5.jar:0.0.5]
	at io.druid.initialization.Initialization.getClassLoaderForCoordinates(Initialization.java:253) [druid-server-0.8.2.jar:0.8.2]
	at io.druid.cli.CliHadoopIndexer.run(CliHadoopIndexer.java:96) [druid-services-0.8.2.jar:0.8.2]
	at io.druid.cli.Main.main(Main.java:91) [druid-services-0.8.2.jar:0.8.2]
2016-01-21T21:06:30,142 ERROR [main] io.druid.cli.CliHadoopIndexer - failure!!!
java.lang.NullPointerException
	at org.eclipse.aether.internal.impl.DefaultRepositorySystem.resolveDependencies(DefaultRepositorySystem.java:361) ~[aether-impl-0.9.0.M2.jar:?]
	at io.tesla.aether.internal.DefaultTeslaAether.resolveArtifacts(DefaultTeslaAether.java:289) ~[tesla-aether-0.0.5.jar:0.0.5]
	at io.druid.initialization.Initialization.getClassLoaderForCoordinates(Initialization.java:253) ~[druid-server-0.8.2.jar:0.8.2]
	at io.druid.cli.CliHadoopIndexer.run(CliHadoopIndexer.java:96) [druid-services-0.8.2.jar:0.8.2]
	at io.druid.cli.Main.main(Main.java:91) [druid-services-0.8.2.jar:0.8.2]
```

I tried downloading hadoop-client-2.3.0.jar and putting it in the local /root/.m2/repository, but it still does not recognize the jar. What am I doing wrong?
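One thing worth checking: Aether resolves against the standard Maven local-repository layout, so the jar cannot simply be dropped into the repository root. A sketch of the path it would look for (the repo path is taken from the log above; treat it as an assumption for your setup):

```shell
# Maven local repositories are laid out as groupId/artifactId/version;
# the jar must keep its full Maven file name inside that directory.
REPO="${REPO:-/root/.m2/repository}"   # localRepository from the log above
ARTIFACT_DIR="$REPO/org/apache/hadoop/hadoop-client/2.3.0"
echo "$ARTIFACT_DIR/hadoop-client-2.3.0.jar"
# Aether typically also needs the matching hadoop-client-2.3.0.pom
# (and the transitive dependencies it declares) next to the jar.
```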

Thanks

Resolved the error. The common configuration folder was missing from the classpath. Please correct the tutorial; it does not pick up the common configuration by default.
```
java -Xmx256m -Duser.timezone=UTC -Dfile.encoding=UTF-8 -classpath config/_common:lib/:/usr/hdp/2.3.2.0-2950/hadoop/conf/ io.druid.cli.Main index hadoop hadoop-indexer.json
```