Historical node Kerberos reauthentication failure - HDFS

Hi,

I am using the following properties in the common runtime properties file to do Kerberos auth with HDFS (on the historical node).

Here are the entries from my common.runtime.properties file.

druid.hadoop.security.kerberos.principal=xxx

druid.hadoop.security.kerberos.keytab=xxxx.xxxx

The historical node is able to authenticate the first time when I start it, but once the ticket expires it is not able to reauthenticate. I get the following error.

In this case it seems to be a Kerberos config issue; by default the KDC does not allow you to renew the ticket.

Here is an example of how to change it: http://championofcyrodiil.blogspot.com/2014/01/kinit-ticket-expired-while-renewing.html
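For reference, a sketch of what that change typically looks like. This requires admin access on the KDC, and the realm name, principal, and lifetimes below are all hypothetical:

```shell
# kdc.conf on the KDC host: allow renewable tickets for the realm
# [realms]
#     EXAMPLE.COM = {
#         max_renewable_life = 7d
#     }

# Then raise the maximum renewable life on the ticket-granting
# principal and on the service principal itself (names hypothetical):
kadmin.local -q "modprinc -maxrenewlife 7days krbtgt/EXAMPLE.COM@EXAMPLE.COM"
kadmin.local -q "modprinc -maxrenewlife 7days druid@EXAMPLE.COM"

# Client side: request a renewable ticket from the keytab
kinit -r 7d -kt /path/to/xxxx.keytab druid@EXAMPLE.COM
```

After this, `klist` should show a "renew until" time on the ticket; if it does not, the KDC is still issuing non-renewable tickets.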

========

[ZkCoordinator-0] org.apache.hadoop.security.UserGroupInformation - PriviledgedActionException as: xxxx@xxxx.xx (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

========

I have tried the suggestion here, but that does not seem to work.

https://github.com/druid-io/druid/issues/1588 (#2 suggested by @himasnshug).

This is my jvm.config:

===========

-server

-Xms8g

-Xmx8g

-XX:MaxDirectMemorySize=6096m

-Duser.timezone=UTC

-Dfile.encoding=UTF-8

-Djava.io.tmpdir=var/tmp

===========

This will not work for sure; it is an internal setting for the Yahoo folks.

Hi Slim,

Thanks for replying. Unfortunately, I do not have access to the kdc settings.

Will setting up a cron job with periodic kinits work?

Thanks

Ankur

That might work if the kinit doesn’t need to be interactive.
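If the KDC does issue tickets long enough to bridge the refresh interval, a non-interactive refresh can be scheduled. A minimal crontab sketch, reusing the placeholder principal and keytab from the question (the schedule and paths are hypothetical):

```shell
# Re-obtain the ticket from the keytab every 6 hours (non-interactive).
# This must run as the same OS user as the Druid process, so that the
# ticket cache it writes is the one Druid reads.
0 */6 * * * kinit -kt /path/to/xxxx.keytab xxx >/dev/null 2>&1
```

Note that `kinit -kt` is fully non-interactive because the key comes from the keytab file, not from a password prompt.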

Hi Slim,

My kinit is not interactive, but that does not seem to work. Once the historical node started throwing auth exceptions I executed the kinit command, but Druid did not pick it up. Is there any other approach to solving this problem?

Thanks
Ankur

That seems weird to me.

Are you doing the kinit as the user running the Druid process?

When you do kinit, can you list the contents of HDFS as the Druid user?

Add the following params to the JVM to get more logs about what is used to negotiate with the KDC server:

-Dsun.security.krb5.debug=true -Dsun.security.spnego.debug=true

My hdfs user and druid user are different. I am running the kinit for the hdfs user using its keytab file while I am logged in as the druid user.

I cannot list the contents, as I do not have the necessary Hadoop libraries installed on the historical node. They are only installed as part of Druid.

I enabled tracing (KRB5_TRACE) on kinit and here are the results:

If the kinit is not done as the druid user, then the Druid process cannot read the /tmp/krb5ccXXXX ticket cache.

Adding the debug Java params will show you what items the Druid process is using to authenticate and why it is rejected.
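A quick way to check this mismatch from the shell. The user name, keytab path, and principal below are hypothetical, and whether the Druid JVM picks up an alternate cache location depends on its Kerberos login configuration:

```shell
# Run the kinit as the same OS user that runs the Druid process
sudo -u druid kinit -kt /path/to/xxxx.keytab xxx

# Verify which cache was written and that the druid user owns it
sudo -u druid klist
ls -l /tmp/krb5cc_*

# Optionally pin both kinit and klist to an explicit cache file via
# KRB5CCNAME, so there is no ambiguity about which cache is in use
export KRB5CCNAME=/var/run/druid/krb5cc_druid
```

Default caches are named `/tmp/krb5cc_<uid>` and are readable only by their owner, which is why a kinit done as the hdfs user is invisible to a process running as the druid user.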