Question

I'm looking to automate the retrieval of files from a number of servers via SSH. The problem is that the servers are on separate networks, so I need to go through two intermediary servers, and those servers provide limited privileges, a limited ssh, and no netcat or connect.

Env -

localhost --> GW1 --> GW2 --> (server1, server2,server3)

GW1, GW2, and even some of the servers have restricted shells.

Currently the method being used is: create a dynamic tunnel to GW1 using PuTTY (GW1 has a stored config to connect to GW2, so ssh from there to GW2), configure FileZilla to use the tunnel, connect to server X, and download a file via SFTP.

Is there any way this can be automated? Even automating just the file-transfer part, using the tunnel to connect and transfer, would be helpful. I vaguely recall succeeding in retrieving files via scp using the tunnel, but I can't remember how I managed it.
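For the "scp over the tunnel" part, one way that works is pointing scp at the SOCKS proxy with a ProxyCommand. This is only a sketch: it assumes the PuTTY dynamic tunnel is already listening on localhost:1080, that the local nc is the BSD/OpenBSD variant (which supports SOCKS via -X/-x), and serverX and the paths are placeholders.

```shell
# Route scp through the existing dynamic (SOCKS5) tunnel on localhost:1080.
# %h and %p are expanded by ssh to the target host and port.
scp -o ProxyCommand="nc -X 5 -x localhost:1080 %h %p" \
    user@serverX:/remote/path/file /local/path/
```

If the local nc lacks SOCKS support, connect-proxy or ssh's own `-o ProxyCommand` chaining (as in the answer below this question) are alternatives.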

No correct solution

OTHER TIPS

This link goes to an article on transparent multihop SSH. It describes a solution that works for me in a similar situation. I just tested it going from Mac OS X through a Linux box to another Linux box, and it worked. Your mileage may vary in other environments.

Using the names you gave above, put this in $HOME/.ssh/config:

Host GW2
    ProxyCommand ssh -q GW1 nc GW2 22

Host server1
    ProxyCommand ssh -q GW2 nc server1 22
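Since there are three target servers, the per-host stanzas can be collapsed into one using the %h token, which ssh expands to whatever host name you asked for. This assumes server2 and server3 are reachable from GW2 the same way server1 is:

```
Host server1 server2 server3
    ProxyCommand ssh -q GW2 nc %h 22
```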

I've used scp over this type of setup and it works fine. The referenced article gives more details, but in short, the ProxyCommand directive gives ssh a pre-configured way to reach the host that's transparent to the end user, so scp and sftp work as if the host were directly reachable.
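With that config in place, the whole retrieval loop can be scripted, since plain `scp server1:/path file` now traverses both gateways. A minimal sketch; the server names, file paths, and the `FILES_TO_FETCH` mapping are placeholders for illustration:

```python
import subprocess

# Map of server -> remote file to fetch; names and paths are placeholders.
FILES_TO_FETCH = {
    "server1": "/var/log/app.log",
    "server2": "/var/log/app.log",
    "server3": "/var/log/app.log",
}

def scp_command(server, remote_path, local_dir="."):
    """Build an scp invocation; the ProxyCommand chain in
    ~/.ssh/config makes the multihop transparent to scp."""
    return ["scp", "-q", f"{server}:{remote_path}", local_dir]

def fetch_all(files=FILES_TO_FETCH, local_dir="."):
    """Download each configured file, raising on the first failure."""
    for server, remote_path in files.items():
        subprocess.run(scp_command(server, remote_path, local_dir),
                       check=True)

# fetch_all()  # uncomment to run against real hosts
```

Dropping this in a cron job (with key-based auth on each hop) would remove the manual PuTTY/FileZilla steps entirely.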

One thing to note: the article specifies using nc -q0, but that option isn't available in the stock nc on my Ubuntu VM or on my Mac. It seems to work fine without it, and a few minutes of searching left me no closer to figuring out what -q0 is supposed to do.

If anyone knows and wants to include it here, I'll happily update my answer.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow