If you choose the key_file way, my guess is that the key must be on the VPS/Vagrant machine, so you might want to copy it there first. Note that you need a private key here, not a public one.
For your second option, you could push a key to a specific user depending on the instance type. Suppose the user on the VPS is vpsuser, and that you deploy mostly to these VPSes; you could do:
group_vars/all:

    deploy_user: vpsuser

group_vars/vagrant:

    deploy_user: vagrant

(group_vars files are YAML, so use `key: value`, not `key=value`.)
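For reference, those group_vars files map onto inventory groups of the same name; a minimal inventory might look like this (hostnames and addresses are made up):

```ini
[vps]
vps1.example.com

[vagrant]
192.168.33.10
```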
Then, you could have a playbook like:
    - name: send key to remote deploy user
      copy: src=files/private_key dest=/home/{{ deploy_user }}/.ssh/priv_key mode=0600

    - name: read-write git checkout from github
      git: repo={{ repository }} dest=/home/site key_file=/home/{{ deploy_user }}/.ssh/priv_key

(Note the destination: `~/{{ deploy_user }}` would expand to a subdirectory of the connecting user's home, so use an explicit `/home/{{ deploy_user }}/...` path instead.)
However, I have no idea how the passphrase for the remote private key would be asked for. I don't think Ansible allows SSH agent forwarding by default (check the -vvvv output), so you might have to fiddle with your ~/.ansible.cfg.
I suggest you use a dedicated key for deployment purposes, with read-only permissions on your git repository. This way, your real private key never leaves your machine. Make this deployment key password-less. I think the security trade-off is acceptable since: it only protects your code, and your code is checked out on the very machine holding the key, so the game is already over if that machine is compromised.
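Generating such a password-less deployment key is a one-liner; the `files/` directory and key name below are just examples matching the playbook's `src=files/private_key` convention:

```shell
# Create a dedicated, password-less deployment key
# (directory and file names are examples)
mkdir -p files
ssh-keygen -t rsa -b 4096 -f files/deploy_key -N "" -q
# Then register files/deploy_key.pub as a read-only deploy key
# on your git hosting (on GitHub: repository Settings -> Deploy keys)
```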
Another option is to distribute your application from your local checkout using Ansible: make a tarball, copy it over, untar, and you're set. This way, you don't need to leave any security credentials on your VPS.
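A sketch of that tarball flow, assuming a local checkout in /path/to/checkout (all paths are hypothetical); the unarchive module handles both the copy and the extraction:

```yaml
# Build the tarball on the control machine, then ship and unpack it
- name: create a tarball of the local checkout
  local_action: command tar czf /tmp/site.tar.gz -C /path/to/checkout .

- name: copy and unpack it on the VPS
  unarchive: src=/tmp/site.tar.gz dest=/home/site
```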
Good luck.