Question

My goal is to run a daily backup (overwriting the previous day's backup) of specific portions of my database so I can easily download it, import it, and have all my important data if anything goes wrong.

I currently have a 20GB web server (Ubuntu) and my database is ~11GB and growing (slowly), so I know I'll need to fire up a second web server to store the backup. (And I'll eventually need to upgrade my primary server once the database approaches ~20GB.)

My data is currently set up in a few indexed tables, but I don't need to back up all the data, so I'd like to run a query that selects only what I need and builds a new dump (.sql). This would help keep the size down, but the file is still going to be very large, so I'd also like to compress it. Would gzip be the way to go? That would also neatly package the entire database into one file, which is something I need.

In addition, since I'll probably be using the second server to request the data from the first server, how do I ensure that the request doesn't time out?

TL;DR: I need to run a daily backup of an enormous (10+ GB) database onto another server, removing certain tables/columns in the process and compressing the result to optimize disk and bandwidth usage, so I can easily download and import the backup (one file) if need be.

Solution

Mysqldump can output selected tables, and you can pipe the output to gzip:

$ mysqldump mydatabase table1 table2 table3 |
    gzip -c > dump.sql.gz
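
To restore from that file later, you can stream the dump back through gunzip into the mysql client (the database name here is just a placeholder):

$ gunzip -c dump.sql.gz | mysql mydatabase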

There's an option for mysqldump to dump a subset of rows.

$ mysqldump --where "created_at > '2014-03-01'" ...other options...

Of course, that WHERE condition must be valid for every table you dump; if it references a column that doesn't exist in one of the tables, the dump fails with an error.

Mysqldump has no option for selecting a subset of columns.
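
One workaround (not part of the original answer; the table and column names below are hypothetical) is to copy just the columns you need into a scratch table and dump that instead:

$ mysql mydatabase -e "DROP TABLE IF EXISTS users_backup;
      CREATE TABLE users_backup AS SELECT id, email, created_at FROM users;"
$ mysqldump mydatabase users_backup | gzip -c > dump.sql.gz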


Re your comment:

$ (mysqldump ...first... ; mysqldump ...second...) | gzip -c > dump.sql.gz
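
For example (the database and table names here are hypothetical), that lets you combine a row-filtered dump of some tables with a full dump of others in a single compressed file:

$ (mysqldump --where "created_at > '2014-03-01'" mydatabase orders order_items ;
      mysqldump mydatabase users) | gzip -c > dump.sql.gz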

Other tips

For your compression, you can run whatever you use to export your DB through a pipe into gzip like so:

mysqldump -uMyUser DatabaseName | gzip -1 > dbdump-`date +'%F-%T'`.gz

That last little bit is to give your resulting gzip a nice timestamp. If you value space more than CPU cycles, you could experiment with bzip2 and see if that gives you some savings, but in general a .gz should be fine.

Check the mysqldump man page for information on choosing which tables to include (e.g. via --tables).
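
If you want this to run daily and overwrite the previous day's file, a cron entry can handle the scheduling. A minimal sketch (the schedule, user, and paths are assumptions); using a fixed filename means yesterday's backup gets replaced each night:

# /etc/cron.d/db-backup: run every night at 02:30
30 2 * * *  root  mysqldump -uMyUser DatabaseName | gzip -c > /backups/dbdump.sql.gz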

You could use MySQL replication to pull the data to the computer/server where you want the backup to be, then have it create the dump/backup on the backup server so you aren't transferring a 10 GB file across the network.
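
A minimal sketch of that idea, assuming replication from the primary is already set up (the server ID, database/table names, and paths below are assumptions):

# my.cnf on the backup server: replicate only the tables you need
[mysqld]
server-id = 2
replicate-do-table = mydatabase.orders
replicate-do-table = mydatabase.users

# then dump locally on the backup server (e.g. from cron), so nothing large crosses the network
$ mysqldump mydatabase orders users | gzip -c > /backups/dbdump.sql.gz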
