Kerberos HTTP service using GSS shows "No valid credentials" due to domain name or hostname mismatch

I have a microservice platform with multiple microservices connected to each other; the platform uses Kerberos to authenticate the microservices. On one of the microservice nodes Hadoop is installed, and it uses a separate KDC for Hadoop cluster authentication.

Let's say the platform domain is "idm.com" and the Hadoop domain is "hadoop.com".

The Resource Manager is running on one node. I have configured the HTTP principal for SPNEGO in core-site.xml by setting the "hadoop.http.authentication.kerberos.principal" property to "HTTP/hadoopmaster.idm.com", and the node's hostname is "hadoopmaster.idm.com".
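For reference, the core-site.xml entry looks roughly like this (a sketch; REALM is a placeholder for the actual Kerberos realm of the Hadoop KDC):

<property>
  <name>hadoop.http.authentication.kerberos.principal</name>
  <value>HTTP/hadoopmaster.idm.com@REALM</value>
</property>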

I do kinit and acquire a TGT for the root user. When I try "curl -k -v --negotiate -u : https://master.hadoop.com:8090/cluster", it shows "GSSException: No valid credentials provided".

If I run klist, it shows two tickets: one krbtgt and a second one, "HTTP/master.hadoop.com" (I have added this principal to the KDC database). The first (krbtgt) I got via kinit; the second (HTTP) one appeared automatically after running curl; it was not there before. The Kerberos client acquired it in order to use the HTTP service.
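For illustration, the ticket cache after the curl looked roughly like this (a sketch; REALM is a placeholder, the cache path is an assumption, and timestamps are omitted):

klist
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: root@REALM

Valid starting     Expires            Service principal
...                ...                krbtgt/REALM@REALM
...                ...                HTTP/master.hadoop.com@REALM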

After some debugging I noticed the problem/behaviour: I got a ticket for HTTP/master.hadoop.com, whereas I had configured Hadoop to use HTTP/hadoopmaster.idm.com. If I configure Hadoop to use "HTTP/master.hadoop.com" instead, the UI is accessible.

I have added both FQDNs to the /etc/hosts file. It seems that whichever FQDN I curl, I get the HTTP ticket for the first matching entry in /etc/hosts. For example, if /etc/hosts looks like this

...
10.7.0.5 hadoopmaster.idm.com
10.7.0.5 master.hadoop.com
...

then after curl I will get HTTP/hadoopmaster.idm.com in klist, and if /etc/hosts looks like this

...
10.7.0.5 master.hadoop.com
10.7.0.5 hadoopmaster.idm.com
...

then after curl I will get HTTP/master.hadoop.com in klist. In both cases, if I set the Hadoop property to the same principal I get via curl, the UI is accessible; otherwise it returns a 403 with a GSSException, which I guess means curl used SPNEGO but did not present credentials the server considers valid. When the ticket matches Hadoop's configured principal, it works.

It looks like the hostname is causing the problem. Is there any way to map this hostname, some Kerberos configuration that can map it, or a property that makes the client request a ticket for exactly the hostname I specified in curl, regardless of the Hadoop configuration?

Answer by uds0128:

Answering my own question for future readers. First read https://stackoverflow.com/questions/75427514/web-interface-login-apache-hadoop-cluster-with-kerberos, where I have explained how Kerberos works when we want to access the UI.

The problem was the following. When I try to access the UI, I type the hostname hadoopmaster.idm.com in the browser, and the IP is resolved from that hostname as 10.7.0.5 using the local DNS configuration (C:\Windows\System32\drivers\etc\hosts on Windows, /etc/hosts on Linux). Then a reverse DNS lookup happens: a hostname is resolved back from the IP 10.7.0.5 by scanning /etc/hosts, and the first matching hostname is picked, which was master.hadoop.com. Because I had multiple hostnames for the same IP, the reverse DNS lookup produced a hostname (master.hadoop.com) different from the one written in the browser or curl and actually served by the service (hadoopmaster.idm.com). As a result, the KDC was asked for a session ticket for the other hostname, master.hadoop.com, and I was getting the 403.
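You can see the same first-entry behaviour from the shell (illustrative; the output depends on the order of entries in your /etc/hosts):

getent hosts 10.7.0.5
10.7.0.5        master.hadoop.com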

I resolved the problem by adding the property rdns = false in the Kerberos client configuration file (/etc/krb5.conf); rdns controls whether the Kerberos library performs a reverse DNS lookup when building the service principal name.
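A minimal sketch of the relevant part of /etc/krb5.conf (all other settings left unchanged):

[libdefaults]
    rdns = false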