Problem

Jenkins keeps using the default "jenkins" user when executing builds. My build requires a number of SSH calls. However, these SSH calls fail with host verification exceptions because I haven't been able to place the public key for this user on the target server.

I don't know where the default "jenkins" user is configured, and therefore can't generate the required public key to place on the target server.

Any suggestions for any of the following:

  1. A way to force Jenkins to use a user I define
  2. A way to enable SSH for the default Jenkins user
  3. A way to fetch the password for the default 'jenkins' user

Ideally I would like to be able to do all of these; any help greatly appreciated.

Solution: I was able to access the default Jenkins user with an SSH request from the target server. Once I was logged in as the jenkins user, I was able to generate the public/private RSA keys, which then allowed password-free access between servers.


Solution 2

The default 'jenkins' user is the system user running your Jenkins instance (master or slave). Depending on your installation, this user may have been created either by the install scripts (deb/rpm/pkg etc.) or manually by your administrator. It may or may not be called 'jenkins'.

To find out which user your Jenkins instance is running as, open http://$JENKINS_SERVER/systemInfo, available from your Manage Jenkins menu.

There you will find your user.home and user.name. E.g. in my case on a Mac OS X master:

user.home   /Users/Shared/Jenkins/Home/
user.name   jenkins

Once you have that information you will need to log onto that jenkins server as the user running jenkins and ssh into those remote servers to accept the ssh fingerprints.

An alternative (that I've never tried) would be to use a custom Jenkins job to accept those fingerprints, for example by running the following command in an SSH build task:

ssh -o "StrictHostKeyChecking no" your_remote_server
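A softer variant, assuming a reasonably recent OpenSSH (7.6+): the `accept-new` value trusts a host on first contact but still rejects a changed key, which is slightly less dangerous than disabling checking outright. `your_remote_server` remains a placeholder here.

```shell
# Sketch: "accept-new" adds unknown hosts to known_hosts automatically,
# but still fails if a previously recorded host key changes.
# your_remote_server is a placeholder; the || keeps the sketch from aborting
# when the host does not actually exist.
out=$(ssh -o "StrictHostKeyChecking accept-new" -o ConnectTimeout=5 \
      your_remote_server true 2>/dev/null \
      || echo "connection failed (placeholder host)")
echo "$out"
```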

This last tip is of course completely unacceptable from a pure security point of view :)

Other tips

Because with numerous slave machines it can be hard to anticipate which of them a build will be executed on, rather than explicitly calling ssh I highly suggest using existing Jenkins plug-ins for executing remote commands over SSH:

  • Publish Over SSH - execute SSH commands or transfer files over SCP/SFTP.
  • SSH - execute SSH commands.

So one might make a "job" which writes the host keys as a constant, like:

echo "....." > ~/.ssh/known_hosts

just fill in the dots with the output of ssh-keyscan -t rsa {ip}, after you verify it.
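A minimal sketch of such a job step, assuming you have already verified the key: `HOST_KEY_LINE` stands in for one line of `ssh-keyscan -t rsa {ip}` output (the value below is a placeholder, not a real key).

```shell
# Sketch: pin a host key that was verified out-of-band.
# HOST_KEY_LINE is a placeholder for real ssh-keyscan output.
HOST_KEY_LINE='build-agent.example ssh-rsa AAAAB3...placeholder'
mkdir -p "$HOME/.ssh"
chmod 700 "$HOME/.ssh"
printf '%s\n' "$HOST_KEY_LINE" >> "$HOME/.ssh/known_hosts"
chmod 600 "$HOME/.ssh/known_hosts"
```

Appending rather than overwriting means re-running the job for additional slaves will not discard previously pinned hosts.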

That's correct: pipeline jobs will normally run as the user jenkins, which means SSH access needs to be set up for this account for it to work in pipeline jobs. People have all sorts of complex build environments, so it seems like a fair requirement.

As stated in one of the answers, each individual configuration could be different, so check under "System Information" or similar, in "Manage Jenkins" on the web UI. There should be a user.home and a user.name for the home directory and the username respectively. On my CentOS installation these are "/var/lib/jenkins/" and "jenkins".

The first thing to do is to get shell access as the user, jenkins in our case. Because this is an auto-generated service account, the shell is not enabled by default. Assuming you can log in as root (or preferably some other user, in which case you'll need to prepend sudo), switch to jenkins as follows:

su -s /bin/bash jenkins

Now you can verify that it's really jenkins and that you entered the right home directory:

whoami
echo $HOME

If these don't match what you see in the configuration, do not proceed.
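That check can also be scripted. The `EXPECTED_*` values below are placeholders; substitute whatever your System Information page shows.

```shell
# Sketch: compare the current shell's identity against the values from
# the System Information page. EXPECTED_USER/EXPECTED_HOME are placeholders.
EXPECTED_USER="jenkins"
EXPECTED_HOME="/var/lib/jenkins"
if [ "$(whoami)" = "$EXPECTED_USER" ] && [ "$HOME" = "$EXPECTED_HOME" ]; then
    check="ok"
else
    check="mismatch"   # stop here and investigate before touching keys
fi
echo "identity check: $check"
```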

All is good so far, let's check what keys we already have:

ls -lah ~/.ssh

There may already be keys there, created with the hostname. See if you can use them:

ssh-copy-id user@host_ip_address

If there's an error, you may need to generate new keys:

ssh-keygen

Accept the default values and an empty passphrase if it prompts you; this adds the new keys to the home directory without overwriting existing ones. Now you can run ssh-copy-id again.
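If you would rather script the generation step, ssh-keygen can run non-interactively. This sketch writes to a temporary directory purely for illustration, so it cannot clobber anything in ~/.ssh; for Jenkins you would target ~/.ssh/id_rsa instead.

```shell
# Sketch: non-interactive key generation. -N "" means empty passphrase,
# -q suppresses the prompts. The temp directory is for illustration only.
keydir="$(mktemp -d)"
ssh-keygen -q -t rsa -b 4096 -N "" -f "$keydir/id_rsa"
ls "$keydir"
```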

It's a good idea to test it with something like

ssh user@host_ip_address ls

If it works, so should ssh, scp, rsync etc. in the Jenkins jobs. Otherwise, check the console output to see the error messages and try those exact commands on the shell as done above.
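For a check you can drop into a Jenkins shell step, `BatchMode=yes` is worth adding: it forbids password prompts, so a broken key setup fails fast instead of hanging the build waiting for input. `user@host_ip_address` is a placeholder as above.

```shell
# Sketch: non-interactive SSH smoke test for a Jenkins shell step.
# BatchMode=yes disallows password prompts; ConnectTimeout bounds the wait.
# user@host_ip_address is a placeholder.
if ssh -o BatchMode=yes -o ConnectTimeout=5 user@host_ip_address true
then
    result="ok"
else
    result="failed"
fi
echo "key-based SSH: $result"
```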

License: CC-BY-SA with attribution