Web servers and ssh servers behind load balancer - Kerberos

Hi folks,
We have a bunch of hosts that allow password-free ssh logins using Kerberos.
These also run web servers, which use mod_auth_kerb.
We also have a BigIP load balancer that has a name; when people ssh or web
access that name, they get round-robin distributed across the cluster.
The LB supports Layer 3 and Layer 5 transparent proxying to the back end.
We have noticed that if people log into nodes with their real hostname,
or web access a url using the real hostname of the server, everything
works as expected.
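One way to see why the direct hostnames work and the balancer name does not: a GSSAPI client builds the service principal it asks the KDC for from the name it connected to (often after DNS canonicalization), not from the backend's real name. A minimal sketch of that construction; the hostnames and realm below are made up for illustration:

```python
def service_principal(service: str, hostname: str, realm: str) -> str:
    """Build the Kerberos service principal a GSSAPI client requests
    a ticket for: service/canonical-hostname@REALM."""
    return f"{service}/{hostname.lower()}@{realm}"

# Connecting to a real node: the KDC has this principal, so auth works.
print(service_principal("host", "node1.example.com", "EXAMPLE.COM"))
# -> host/node1.example.com@EXAMPLE.COM

# Connecting to the balancer name: unless this principal exists in the
# KDC (and its key is in the backend's keytab), the server side fails
# with "Server not found in Kerberos database".
print(service_principal("HTTP", "lb.example.com", "EXAMPLE.COM"))
# -> HTTP/lb.example.com@EXAMPLE.COM
```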
However, attempting to ssh to the load balancer's address typically gives:

    debug1: Authentications that can continue:
    debug1: Next authentication method: gssapi-with-mic
    debug1: Delegating credentials
    debug1: Miscellaneous failure
    debug1: Trying to start again
And when users try to access the web server through the load balancer,
authentication never succeeds, and mod_auth_kerb logs:

    failed to verify krb5 credentials: Server not found in Kerberos database
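That error suggests the Apache side is trying to accept the connection under a principal the KDC doesn't know. For the web side to answer for the balancer's name, mod_auth_kerb would need to present a key for HTTP/<lb-name> rather than (or in addition to) the node's own name. A hedged sketch of what such a vhost section might look like; the paths, hostname, and realm are placeholders, not from our setup:

```apache
<Location /secure>
  AuthType Kerberos
  AuthName "Kerberos Login"
  KrbMethodNegotiate On
  KrbAuthRealms EXAMPLE.COM
  # Service principal matching the name clients actually connect to:
  KrbServiceName HTTP/lb.example.com
  # This keytab must contain the key for that principal, at the
  # key version number (kvno) the KDC currently has:
  Krb5Keytab /etc/httpd/conf/lb.keytab
  Require valid-user
</Location>
```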
Logging into a machine over ssh through the load balancer shows the
connection coming from the load balancer's IP address, not from the
client's.
We made some attempts at putting service keys for the load balancer's
hostname into the srvtab (keytab) on each of the servers, but never had
any luck.
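For reference, the usual MIT Kerberos recipe for this is roughly the following (principal names and realm are illustrative). One common pitfall worth checking: by default, each `ktadd` re-randomizes the principal's key and bumps its kvno, so extracting the same principal separately on N servers leaves N-1 of them with stale keys. The keytab has to be generated once and copied to every node:

```shell
# On a host with kadmin access: create principals for the LB name.
kadmin -q "addprinc -randkey host/lb.example.com"
kadmin -q "addprinc -randkey HTTP/lb.example.com"

# Extract both keys ONCE into a single keytab...
kadmin -q "ktadd -k /tmp/lb.keytab host/lb.example.com HTTP/lb.example.com"

# ...then distribute that same file to every backend (re-running ktadd
# per host would bump the kvno and invalidate the copies already sent).
# scp /tmp/lb.keytab nodeN:/etc/krb5.keytab.lb
```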
Any ideas? I did some low-level tcpdumping and traced through various
parts of the Kerberos code, and came up with some bizarre results for
why we are getting these failures.