Druid - Hadoop Remote Exception

I am running a realtime node that reads CSV files using the CSV firehose. I am using HDFS as my deep storage, authenticating via Kerberos (not having enough documentation in this area killed me :-)).
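For context, the deep-storage side of my setup is roughly the following runtime properties fragment (the namenode host and storage path below are placeholders, not my actual values):

```properties
# Load the HDFS deep-storage extension and point segments at HDFS.
# Host/path values are placeholders for illustration only.
druid.extensions.loadList=["druid-hdfs-storage"]
druid.storage.type=hdfs
druid.storage.storageDirectory=hdfs://namenode:8020/druid/segments
```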

To run the realtime node, I sudo as "eng" and then start the process.

During runtime, while persisting segments to HDFS, I see the following error:

org.apache.hadoop.ipc.RemoteException: User: eng/eng@UNIX.company.COM is not allowed to impersonate eng
    at org.apache.hadoop.ipc.Client.call(Client.java:1406) ~[?:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1359) ~[?:?]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206) ~[?:?]
    at com.sun.proxy.$Proxy100.mkdirs(Unknown Source) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_79]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_79]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_79]
    at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_79]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186) ~[?:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[?:?]
    at com.sun.proxy.$Proxy100.mkdirs(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:487) ~[?:?]
    at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2532) ~[?:?]
    at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2503) ~[?:?]
    at org.apache.hadoop.hdfs.DistributedFileSystem$16.doCall(DistributedFileSystem.java:823) ~[?:?]
    at org.apache.hadoop.hdfs.DistributedFileSystem$16.doCall(DistributedFileSystem.java:819) ~[?:?]
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[?:?]
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:819) ~[?:?]
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:812) ~[?:?]
    at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1816) ~[?:?]
    at io.druid.storage.hdfs.HdfsDataSegmentPusher.push(HdfsDataSegmentPusher.java:87) ~[?:?]
    at io.druid.segment.realtime.plumber.RealtimePlumber$4.doRun(RealtimePlumber.java:550) [druid-server-0.9.0.jar:0.9.0]
    at io.druid.common.guava.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:42) [druid-common-0.9.0.jar:0.9.0]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [?:1.7.0_79]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [?:1.7.0_79]
    at java.lang.Thread.run(Thread.java:745) [?:1.7.0_79]

Note: eng/eng@UNIX.company.COM is my Kerberos principal.
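For reference, my understanding is that the "is not allowed to impersonate" check is enforced by Hadoop's proxy-user (impersonation) rules, which live in the cluster's core-site.xml rather than in Druid. A minimal fragment that would allow a user `eng` to proxy (the wildcard host/group values are wide-open placeholders for illustration, not a recommendation) would look like:

```xml
<!-- core-site.xml on the cluster: allow the "eng" user to impersonate.
     Wildcards are placeholders; real deployments should restrict
     the allowed hosts and groups. -->
<property>
  <name>hadoop.proxyuser.eng.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.eng.groups</name>
  <value>*</value>
</property>
```

Whether this is the right fix presumably depends on why the client is attempting impersonation at all; the `hadoop.security.auth_to_local` rules that map `eng/eng@UNIX.company.COM` to a short name may also be worth checking.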

Is there a setting in Druid to disable impersonation? And what is the process for bringing up a realtime node with the Kerberos keytab file included?

Any advice?