A super-nice tool for doing Git deployment over FTP is git-ftp. Unfortunately, it's one-way, i.e. it assumes the destination files are only ever updated by you (and via git-ftp).
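For reference, the happy path with git-ftp looks roughly like this (the host, path and credentials are placeholders):

    # One-time upload of everything tracked by Git; records the pushed commit on the server
    git ftp init --user alice --passwd secret ftp://ftp.example.com/public_html

    # Subsequent deployments upload only the files changed since the last push
    git ftp push --user alice --passwd secret ftp://ftp.example.com/public_html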
So for your case you'd have to create some sort of synthetic history of remote changes when they occur, and the only way to do that is to get the files and then manually create "the delta" between them. So, for this, I'm with @VonC: you'd have to transfer the files back (rsync does this efficiently; if it's not available, google for a utility that can crawl the remote hierarchy using the FTP protocol itself and fetch only the missing/more recent files) and then make the changes known to Git.
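lftp's mirror command is one such utility. A sketch, with a made-up host and paths (note that --only-newer relies on the server's file timestamps, so how well it works depends on the FTP server):

    # Mirror the remote tree into the local checkout over FTP, fetching only
    # newer files and deleting local files that are gone from the remote
    lftp -u alice -e "mirror --only-newer --delete /public_html /path/to/checkout; quit" ftp://ftp.example.com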
This process may be done either on a separate branch or on a temporary throw-away branch. Basically, you fork a branch off the point you think represented the state of the remote files the last time you looked at them, and then run a synchronization program over the checkout. (Note that such a program has to track removals, that is, it has to delete files locally which were removed on the remote side.) Then run git add --all . to make Git stage all updated files and all removals for commit, and also add all presently untracked files. Then record a commit. This commit will actually represent the changes made to the remote. You can then base your further work on this state.
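Put together, the round-trip might look something like this (the branch name, the last-deployed ref and the sync command are illustrative, not prescribed):

    # Fork a throw-away branch off the commit matching the last deployed state
    git checkout -b remote-changes last-deployed

    # Pull the remote tree over the checkout (see the lftp sketch above),
    # letting it also delete files that were removed on the server
    lftp -u alice -e "mirror --only-newer --delete /public_html .; quit" ftp://ftp.example.com

    # Stage modifications, deletions and brand-new files in one go
    git add --all .

    # This commit now represents the changes made on the remote
    git commit -m "Import changes made directly on the server"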