Connecting remote HDFS to Druid

Hello,
I have been working on ingesting batch data from HDFS into Apache Druid. My Hadoop cluster and my Druid Hadoop client are installed on different servers. I followed the steps in the Druid docs and made the changes below:

  1. Copied all *.xml files from the Hadoop cluster into a "hadoop-xml" folder within
     /conf/druid/single-server/micro-quickstart/_common.
  2. Updated my common.runtime.properties as below (a fuller sketch of the block follows this list):

     #druid.storage.type=local
     #druid.storage.storageDirectory=var/druid/segments

     # For HDFS:
     druid.storage.type=hdfs
     druid.storage.storageDirectory=/druid/segments

  3. Commented out the "verify bin/verify-default-ports" line in micro-quickstart.conf.
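
For reference, after these changes the deep storage section of my common.runtime.properties looks roughly like the sketch below. The loadList shows just the extensions I see being loaded in my logs, and the indexing-log directory is my own choice rather than a required value:

    druid.extensions.loadList=["druid-hdfs-storage", "druid-kafka-indexing-service", "druid-datasketches"]

    # Deep storage on HDFS
    druid.storage.type=hdfs
    druid.storage.storageDirectory=/druid/segments

    # Indexing task logs on HDFS (optional; included for completeness)
    druid.indexer.logs.type=hdfs
    druid.indexer.logs.directory=/druid/indexing-logs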

Then, when I start Druid, I get the errors below in almost all of the log files.
Example from router.log:
[sjoshi122@ip-1-71-62-6 sv]$ less router.log
2020-02-06T19:18:18,968 INFO [main] org.hibernate.validator.internal.util.Version - HV000001: Hibernate Validator 5.2.5.Final
2020-02-06T19:18:19,882 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-hdfs-storage], jars: hadoop-mapreduce-client-core-2.8.5.jar, hadoop-yarn-api-2.8.5.jar, commons-configuration-1.6.jar, apacheds-i18n-2.0.0-M15.jar, hadoop-common-2.8.5.jar, jetty-sslengine-6.1.26.jar, hadoop-client-2.8.5.jar, curator-framework-4.1.0.jar, htrace-core4-4.0.1-incubating.jar, commons-digester-1.8.jar, jcip-annotations-1.0-1.jar, xmlenc-0.52.jar, hadoop-mapreduce-client-app-2.8.5.jar, json-smart-2.3.jar, hadoop-auth-2.8.5.jar, asm-7.1.jar, jackson-core-asl-1.9.13.jar, jsp-api-2.1.jar, hadoop-yarn-client-2.8.5.jar, api-util-1.0.3.jar, commons-collections-3.2.2.jar, api-asn1-api-1.0.0-M20.jar, apacheds-kerberos-codec-2.0.0-M15.jar, hadoop-yarn-server-common-2.8.5.jar, hadoop-annotations-2.8.5.jar, hadoop-mapreduce-client-jobclient-2.8.5.jar, hadoop-hdfs-client-2.8.5.jar, curator-recipes-4.1.0.jar, accessors-smart-1.2.jar, gson-2.2.4.jar, leveldbjni-all-1.8.jar, commons-net-3.6.jar, jackson-mapper-asl-1.9.13.jar, hadoop-mapreduce-client-common-2.8.5.jar, hadoop-mapreduce-client-shuffle-2.8.5.jar, nimbus-jose-jwt-4.41.1.jar, druid-hdfs-storage-0.17.0.jar
2020-02-06T19:18:19,902 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-kafka-indexing-service], jars: druid-kafka-indexing-service-0.17.0.jar, snappy-java-1.1.7.2.jar, zstd-jni-1.3.3-1.jar, lz4-java-1.6.0.jar, kafka-clients-2.2.1.jar
2020-02-06T19:18:19,927 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-datasketches], jars: druid-datasketches-0.17.0.jar, commons-math3-3.6.1.jar
2020-02-06T19:18:20,309 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-hdfs-storage], jars: hadoop-mapreduce-client-core-2.8.5.jar, hadoop-yarn-api-2.8.5.jar, commons-configuration-1.6.jar, apacheds-i18n-2.0.0-M15.jar, hadoop-common-2.8.5.jar, jetty-sslengine-6.1.26.jar, hadoop-client-2.8.5.jar, curator-framework-4.1.0.jar, htrace-core4-4.0.1-incubating.jar, commons-digester-1.8.jar, jcip-annotations-1.0-1.jar, xmlenc-0.52.jar, hadoop-mapreduce-client-app-2.8.5.jar, json-smart-2.3.jar, hadoop-auth-2.8.5.jar, asm-7.1.jar, jackson-core-asl-1.9.13.jar, jsp-api-2.1.jar, hadoop-yarn-client-2.8.5.jar, api-util-1.0.3.jar, commons-collections-3.2.2.jar, api-asn1-api-1.0.0-M20.jar, apacheds-kerberos-codec-2.0.0-M15.jar, hadoop-yarn-server-common-2.8.5.jar, hadoop-annotations-2.8.5.jar, hadoop-mapreduce-client-jobclient-2.8.5.jar, hadoop-hdfs-client-2.8.5.jar, curator-recipes-4.1.0.jar, accessors-smart-1.2.jar, gson-2.2.4.jar, leveldbjni-all-1.8.jar, commons-net-3.6.jar, jackson-mapper-asl-1.9.13.jar, hadoop-mapreduce-client-common-2.8.5.jar, hadoop-mapreduce-client-shuffle-2.8.5.jar, nimbus-jose-jwt-4.41.1.jar, druid-hdfs-storage-0.17.0.jar
2020-02-06T19:18:20,315 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-kafka-indexing-service], jars: druid-kafka-indexing-service-0.17.0.jar, snappy-java-1.1.7.2.jar, zstd-jni-1.3.3-1.jar, lz4-java-1.6.0.jar, kafka-clients-2.2.1.jar
2020-02-06T19:18:20,319 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-datasketches], jars: druid-datasketches-0.17.0.jar, commons-math3-3.6.1.jar
2020-02-06T19:18:22,796 INFO [main] com.google.inject.Guice - An exception was caught and reported. Message: KrbException: Cannot locate default realm
java.lang.IllegalArgumentException: Can't get Kerberos realm

at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:65) ~[?:?]
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:296) ~[?:?]
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:281) ~[?:?]
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:837) ~[?:?]
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:807) ~[?:?]
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:680) ~[?:?]
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2978) ~[?:?]
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2968) ~[?:?]
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830) ~[?:?]
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389) ~[?:?]
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:181) ~[?:?]
at org.apache.druid.storage.hdfs.HdfsStorageDruidModule.configure(HdfsStorageDruidModule.java:95) ~[?:?]
at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340) ~[guice-4.1.0.jar:?]
at com.google.inject.spi.Elements.getElements(Elements.java:110) ~[guice-4.1.0.jar:?]
at com.google.inject.util.Modules$OverrideModule.configure(Modules.java:198) ~[guice-4.1.0.jar:?]
at com.google.inject.AbstractModule.configure(AbstractModule.java:62) ~[guice-4.1.0.jar:?]
at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340) ~[guice-4.1.0.jar:?]
at com.google.inject.spi.Elements.getElements(Elements.java:110) ~[guice-4.1.0.jar:?]
at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:138) [guice-4.1.0.jar:?]
at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:104) [guice-4.1.0.jar:?]
at com.google.inject.Guice.createInjector(Guice.java:99) [guice-4.1.0.jar:?]
at com.google.inject.Guice.createInjector(Guice.java:73) [guice-4.1.0.jar:?]
at com.google.inject.Guice.createInjector(Guice.java:62) [guice-4.1.0.jar:?]
at org.apache.druid.initialization.Initialization.makeInjectorWithModules(Initialization.java:431) [druid-server-0.17.0.jar:0.17.0]
at org.apache.druid.cli.GuiceRunnable.makeInjector(GuiceRunnable.java:69) [druid-services-0.17.0.jar:0.17.0]
at org.apache.druid.cli.ServerRunnable.run(ServerRunnable.java:58) [druid-services-0.17.0.jar:0.17.0]
at org.apache.druid.cli.Main.main(Main.java:113) [druid-services-0.17.0.jar:0.17.0]
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_242]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_242]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_242]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_242]
at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:110) ~[?:?]
at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63) ~[?:?]
… 26 more
Caused by: sun.security.krb5.KrbException: Cannot locate default realm
at sun.security.krb5.Config.getDefaultRealm(Config.java:1134) ~[?:1.8.0_242]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_242]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_242]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_242]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_242]
at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:110) ~[?:?]
at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63) ~[?:?]
… 26 more
Exception in thread "main" java.lang.RuntimeException: com.google.inject.CreationException: Unable to create injector, see the following errors:

  1. No implementation for org.apache.hadoop.conf.Configuration annotated with @org.apache.druid.guice.Hdfs() was bound.
    while locating org.apache.hadoop.conf.Configuration annotated with @org.apache.druid.guice.Hdfs()
    for the 1st parameter of org.apache.druid.storage.hdfs.HdfsDataSegmentKiller.<init>(HdfsDataSegmentKiller.java:48)
    at org.apache.druid.storage.hdfs.HdfsStorageDruidModule.configure(HdfsStorageDruidModule.java:83) (via modules: com.google.inject.util.Modules$OverrideModule -> org.apache.druid.storage.hdfs.HdfsStorageDruidModule)
  2. No implementation for org.apache.hadoop.conf.Configuration annotated with @org.apache.druid.guice.Hdfs() was bound.
    while locating org.apache.hadoop.conf.Configuration annotated with @org.apache.druid.guice.Hdfs()
    for the 2nd parameter of org.apache.druid.storage.hdfs.HdfsDataSegmentPusher.<init>(HdfsDataSegmentPusher.java:69)
    at org.apache.druid.storage.hdfs.HdfsStorageDruidModule.configure(HdfsStorageDruidModule.java:82) (via modules: com.google.inject.util.Modules$OverrideModule -> org.apache.druid.storage.hdfs.HdfsStorageDruidModule)
  3. No implementation for org.apache.hadoop.conf.Configuration annotated with @org.apache.druid.guice.Hdfs() was bound.
    while locating org.apache.hadoop.conf.Configuration annotated with @org.apache.druid.guice.Hdfs()
    for the 1st parameter of org.apache.druid.storage.hdfs.HdfsFileTimestampVersionFinder.<init>(HdfsFileTimestampVersionFinder.java:42)
    at org.apache.druid.storage.hdfs.HdfsStorageDruidModule.configure(HdfsStorageDruidModule.java:78) (via modules: com.google.inject.util.Modules$OverrideModule -> org.apache.druid.storage.hdfs.HdfsStorageDruidModule)
  4. An exception was caught and reported. Message: Can't get Kerberos realm
    at com.google.inject.util.Modules$OverrideModule.configure(Modules.java:198)

I then copied krb5.conf from the Hadoop cluster's /etc folder to the Druid Hadoop client's /etc folder.
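
As an aside, I believe the Druid JVMs could instead be pointed at the file explicitly rather than relying on the default /etc location, by adding the standard Kerberos system property to each service's jvm.config. The path below is simply where my copy lives, so treat it as a placeholder:

    -Djava.security.krb5.conf=/etc/krb5.conf

I have not tested that variant; the plain copy into /etc is what I actually did.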
But then I got the error below:

2020-02-06T19:45:57,973 INFO [main] org.hibernate.validator.internal.util.Version - HV000001: Hibernate Validator 5.2.5.Final
2020-02-06T19:45:58,906 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-hdfs-storage], jars: hadoop-mapreduce-client-core-2.8.5.jar, hadoop-yarn-api-2.8.5.jar, commons-configuration-1.6.jar, apacheds-i18n-2.0.0-M15.jar, hadoop-common-2.8.5.jar, jetty-sslengine-6.1.26.jar, hadoop-client-2.8.5.jar, curator-framework-4.1.0.jar, htrace-core4-4.0.1-incubating.jar, commons-digester-1.8.jar, jcip-annotations-1.0-1.jar, xmlenc-0.52.jar, hadoop-mapreduce-client-app-2.8.5.jar, json-smart-2.3.jar, hadoop-auth-2.8.5.jar, asm-7.1.jar, jackson-core-asl-1.9.13.jar, jsp-api-2.1.jar, hadoop-yarn-client-2.8.5.jar, api-util-1.0.3.jar, commons-collections-3.2.2.jar, api-asn1-api-1.0.0-M20.jar, apacheds-kerberos-codec-2.0.0-M15.jar, hadoop-yarn-server-common-2.8.5.jar, hadoop-annotations-2.8.5.jar, hadoop-mapreduce-client-jobclient-2.8.5.jar, hadoop-hdfs-client-2.8.5.jar, curator-recipes-4.1.0.jar, accessors-smart-1.2.jar, gson-2.2.4.jar, leveldbjni-all-1.8.jar, commons-net-3.6.jar, jackson-mapper-asl-1.9.13.jar, hadoop-mapreduce-client-common-2.8.5.jar, hadoop-mapreduce-client-shuffle-2.8.5.jar, nimbus-jose-jwt-4.41.1.jar, druid-hdfs-storage-0.17.0.jar
2020-02-06T19:45:58,928 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-kafka-indexing-service], jars: druid-kafka-indexing-service-0.17.0.jar, snappy-java-1.1.7.2.jar, zstd-jni-1.3.3-1.jar, lz4-java-1.6.0.jar, kafka-clients-2.2.1.jar
2020-02-06T19:45:58,938 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-datasketches], jars: druid-datasketches-0.17.0.jar, commons-math3-3.6.1.jar
2020-02-06T19:45:59,232 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-hdfs-storage], jars: hadoop-mapreduce-client-core-2.8.5.jar, hadoop-yarn-api-2.8.5.jar, commons-configuration-1.6.jar, apacheds-i18n-2.0.0-M15.jar, hadoop-common-2.8.5.jar, jetty-sslengine-6.1.26.jar, hadoop-client-2.8.5.jar, curator-framework-4.1.0.jar, htrace-core4-4.0.1-incubating.jar, commons-digester-1.8.jar, jcip-annotations-1.0-1.jar, xmlenc-0.52.jar, hadoop-mapreduce-client-app-2.8.5.jar, json-smart-2.3.jar, hadoop-auth-2.8.5.jar, asm-7.1.jar, jackson-core-asl-1.9.13.jar, jsp-api-2.1.jar, hadoop-yarn-client-2.8.5.jar, api-util-1.0.3.jar, commons-collections-3.2.2.jar, api-asn1-api-1.0.0-M20.jar, apacheds-kerberos-codec-2.0.0-M15.jar, hadoop-yarn-server-common-2.8.5.jar, hadoop-annotations-2.8.5.jar, hadoop-mapreduce-client-jobclient-2.8.5.jar, hadoop-hdfs-client-2.8.5.jar, curator-recipes-4.1.0.jar, accessors-smart-1.2.jar, gson-2.2.4.jar, leveldbjni-all-1.8.jar, commons-net-3.6.jar, jackson-mapper-asl-1.9.13.jar, hadoop-mapreduce-client-common-2.8.5.jar, hadoop-mapreduce-client-shuffle-2.8.5.jar, nimbus-jose-jwt-4.41.1.jar, druid-hdfs-storage-0.17.0.jar
2020-02-06T19:45:59,235 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-kafka-indexing-service], jars: druid-kafka-indexing-service-0.17.0.jar, snappy-java-1.1.7.2.jar, zstd-jni-1.3.3-1.jar, lz4-java-1.6.0.jar, kafka-clients-2.2.1.jar
2020-02-06T19:45:59,251 INFO [main] org.apache.druid.initialization.Initialization - Loading extension [druid-datasketches], jars: druid-datasketches-0.17.0.jar, commons-math3-3.6.1.jar
2020-02-06T19:46:01,825 INFO [main] com.google.inject.Guice - An exception was caught and reported. Message: java.io.FileNotFoundException: /etc/hadoop/conf/ldap-conn-pass.txt (No such file or directory)
java.lang.RuntimeException: Could not read password file: /etc/hadoop/conf/ldap-conn-pass.txt
at org.apache.hadoop.security.LdapGroupsMapping.extractPassword(LdapGroupsMapping.java:501) ~[?:?]
at org.apache.hadoop.security.LdapGroupsMapping.setConf(LdapGroupsMapping.java:399) ~[?:?]
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76) ~[?:?]
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136) ~[?:?]
at org.apache.hadoop.security.Groups.<init>(Groups.java:106) ~[?:?]
at org.apache.hadoop.security.Groups.<init>(Groups.java:102) ~[?:?]
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:450) ~[?:?]
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:314) ~[?:?]
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:281) ~[?:?]
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:837) ~[?:?]
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:807) ~[?:?]
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:680) ~[?:?]
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2978) ~[?:?]
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2968) ~[?:?]
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830) ~[?:?]
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389) ~[?:?]
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:181) ~[?:?]
at org.apache.druid.storage.hdfs.HdfsStorageDruidModule.configure(HdfsStorageDruidModule.java:95) ~[?:?]
at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340) ~[guice-4.1.0.jar:?]
at com.google.inject.spi.Elements.getElements(Elements.java:110) ~[guice-4.1.0.jar:?]
at com.google.inject.util.Modules$OverrideModule.configure(Modules.java:198) ~[guice-4.1.0.jar:?]
at com.google.inject.AbstractModule.configure(AbstractModule.java:62) ~[guice-4.1.0.jar:?]
at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340) ~[guice-4.1.0.jar:?]
at com.google.inject.spi.Elements.getElements(Elements.java:110) ~[guice-4.1.0.jar:?]
at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:138) [guice-4.1.0.jar:?]
at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:104) [guice-4.1.0.jar:?]
at com.google.inject.Guice.createInjector(Guice.java:99) [guice-4.1.0.jar:?]
at com.google.inject.Guice.createInjector(Guice.java:73) [guice-4.1.0.jar:?]
at com.google.inject.Guice.createInjector(Guice.java:62) [guice-4.1.0.jar:?]
at org.apache.druid.initialization.Initialization.makeInjectorWithModules(Initialization.java:431) [druid-server-0.17.0.jar:0.17.0]
at org.apache.druid.cli.GuiceRunnable.makeInjector(GuiceRunnable.java:69) [druid-services-0.17.0.jar:0.17.0]
at org.apache.druid.cli.ServerRunnable.run(ServerRunnable.java:58) [druid-services-0.17.0.jar:0.17.0]
at org.apache.druid.cli.Main.main(Main.java:113) [druid-services-0.17.0.jar:0.17.0]
Caused by: java.io.FileNotFoundException: /etc/hadoop/conf/ldap-conn-pass.txt (No such file or directory)
at java.io.FileInputStream.open0(Native Method) ~[?:1.8.0_242]
at java.io.FileInputStream.open(FileInputStream.java:195) ~[?:1.8.0_242]
at java.io.FileInputStream.<init>(FileInputStream.java:138) ~[?:1.8.0_242]
at java.io.FileInputStream.<init>(FileInputStream.java:93) ~[?:1.8.0_242]
at org.apache.hadoop.security.LdapGroupsMapping.extractPassword(LdapGroupsMapping.java:492) ~[?:?]
… 32 more

Exception in thread "main" java.lang.RuntimeException: com.google.inject.CreationException: Unable to create injector, see the following errors:

  1. No implementation for org.apache.hadoop.conf.Configuration annotated with @org.apache.druid.guice.Hdfs() was bound.
    while locating org.apache.hadoop.conf.Configuration annotated with @org.apache.druid.guice.Hdfs()
    for the 1st parameter of org.apache.druid.storage.hdfs.HdfsDataSegmentKiller.<init>(HdfsDataSegmentKiller.java:48)
    at org.apache.druid.storage.hdfs.HdfsStorageDruidModule.configure(HdfsStorageDruidModule.java:83) (via modules: com.google.inject.util.Modules$OverrideModule -> org.apache.druid.storage.hdfs.HdfsStorageDruidModule)
  2. No implementation for org.apache.hadoop.conf.Configuration annotated with @org.apache.druid.guice.Hdfs() was bound.
    while locating org.apache.hadoop.conf.Configuration annotated with @org.apache.druid.guice.Hdfs()
    for the 2nd parameter of org.apache.druid.storage.hdfs.HdfsDataSegmentPusher.<init>(HdfsDataSegmentPusher.java:69)
    at org.apache.druid.storage.hdfs.HdfsStorageDruidModule.configure(HdfsStorageDruidModule.java:82) (via modules: com.google.inject.util.Modules$OverrideModule -> org.apache.druid.storage.hdfs.HdfsStorageDruidModule)
  3. No implementation for org.apache.hadoop.conf.Configuration annotated with @org.apache.druid.guice.Hdfs() was bound.
    while locating org.apache.hadoop.conf.Configuration annotated with @org.apache.druid.guice.Hdfs()
    for the 1st parameter of org.apache.druid.storage.hdfs.HdfsFileTimestampVersionFinder.<init>(HdfsFileTimestampVersionFinder.java:42)
    at org.apache.druid.storage.hdfs.HdfsStorageDruidModule.configure(HdfsStorageDruidModule.java:78) (via modules: com.google.inject.util.Modules$OverrideModule -> org.apache.druid.storage.hdfs.HdfsStorageDruidModule)
  4. An exception was caught and reported. Message: Could not read password file: /etc/hadoop/conf/ldap-conn-pass.txt
    at com.google.inject.util.Modules$OverrideModule.configure(Modules.java:198)
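
My guess is that this comes from the core-site.xml I copied over in step 1: on the cluster, group resolution is configured via LDAP, so Druid tries to initialize LdapGroupsMapping locally and cannot find the bind password file. I believe the relevant properties look something like the sketch below (values are illustrative, taken from the error message rather than from my actual file):

    <property>
      <name>hadoop.security.group.mapping</name>
      <value>org.apache.hadoop.security.LdapGroupsMapping</value>
    </property>
    <property>
      <name>hadoop.security.group.mapping.ldap.bind.password.file</name>
      <value>/etc/hadoop/conf/ldap-conn-pass.txt</value>
    </property>

I am not sure whether the right fix is to copy that password file to the Druid host as well, or to override hadoop.security.group.mapping locally (for example to org.apache.hadoop.security.ShellBasedUnixGroupsMapping), so guidance on that would be appreciated.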

Could someone please help or provide the steps for connecting a remote Hadoop cluster to Druid?
My Hadoop version is 2.7.3.2.6.3.120-1.
Thanks!