Question

I'm brand new to shell scripting and have been searching for examples of how to create a backup script for my website, but I'm unable to find anything, or at least anything I understand.

I have a Synology Diskstation server that I'd like to use to automatically (through its scheduler) take backups of my website.

I'm currently doing this via Automator on my Mac in conjunction with the Transmit FTP program, but making it a command-line process is where I struggle.

This is what I'm looking to do in a script:

1) Open a URL without a browser (this URL creates a MySQL dump of the databases on the server, to be downloaded later). An example URL would be http://mywebsite.com/dump.php

2) Use FTP to download all files from the server. (Currently Transmit FTP handles this as a sync function and only downloads files where the remote file date is newer than the local file. It will also remove any local files that don't exist on the remote server.)

3) Create a compressed archive of the files from step 2, named website_CURRENT-DATE

4) Move the archive from step 3 to a specific folder, and delete any files in that folder older than 120 days.

Right now I don't know how to do step 1 or the synchronization in step 2 (I see how I can use wget to download the whole site, but it seems as though that would download everything each time it runs, even if nothing has changed).

Steps 3 and 4 are probably easy to find via searching, but I haven't searched for those yet since I can't get past step 1.

Thanks!

Also, FYI: my web host doesn't do these types of backups, which is why I'd like to do my own.

Solution

Answering each of your questions in order, then (a combined script sketch follows the list):

  1. Several options; the most common would be wget http://mywebsite.com/dump.php or curl http://mywebsite.com/dump.php (curl prints the response to stdout by default, while wget saves it to a file).

  2. Since you have ssh access to the server, you can very easily use rsync to grab a snapshot of the files on disk, e.g. rsync -e ssh --delete --stats -zav username@mywebsite.com:/path/to/files/ /path/to/local/backup. The --delete flag removes local files that no longer exist on the server, matching the sync behavior you get from Transmit.

  3. Once you have the snapshot from rsync, you can make a compressed, dated copy with cd /path/to/local/backup && tar czvf /path/to/archives/website-$(date +%Y-%m-%d).tgz * (note the added z flag, so the archive is actually gzip-compressed to match its .tgz extension).

  4. find /path/to/archives -mtime +120 -type f -exec rm -f '{}' \; will remove all backups older than 120 days.
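
Putting those four pieces together, here is a minimal sketch of the complete script. The hostname, username, and all paths are the placeholders from the examples above, so substitute your own:

    #!/bin/sh
    # Minimal website-backup sketch. The hostname, username, and all
    # paths below are placeholders -- adjust them for your own setup.

    REMOTE="username@mywebsite.com:/path/to/files/"
    LOCAL="/path/to/local/backup"
    ARCHIVES="/path/to/archives"

    # Step 1: request the dump URL so the server writes a fresh MySQL
    # dump; the page output itself is discarded.
    curl -s -o /dev/null http://mywebsite.com/dump.php

    # Step 2: sync the remote files down; --delete removes local files
    # that no longer exist on the server, like Transmit's sync does.
    rsync -e ssh --delete --stats -zav "$REMOTE" "$LOCAL"

    # Step 3: create a compressed archive named with today's date.
    cd "$LOCAL" || exit 1
    tar czf "$ARCHIVES/website_$(date +%Y-%m-%d).tgz" *

    # Step 4: delete archives older than 120 days.
    find "$ARCHIVES" -type f -mtime +120 -exec rm -f '{}' \;

Make the script executable with chmod +x, and you can then point the DiskStation's Task Scheduler at it as a user-defined task.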

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow