Question

When I dump a Sybase database, it doesn't seem to matter whether there's data in the tables or not: the file size is the same. I've been told that this is because my dump file is binary rather than logical, so the size of the dump file is based on the allocated size of the database. I know that Oracle can use logical dump files, but can I get Sybase to do something similar, or are there any other sneaky ways of getting the dump file size down?


Solution

Since somewhere around version 12, ASE has been able to perform compressed dumps.

The syntax is: dump database database_name to file_name [with compression=compress_level]

compress_level ranges from 0 to 9: 0 is no compression and 9 is the most. The higher the compression level, the higher the CPU usage whilst running the dump. You just need to perform a little testing to find an appropriate balance of size versus performance.

No special commands are needed to load the dump.
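As a sketch, assuming ASE 12.5-or-later syntax (the database name `mydb` and the dump path are placeholders), a level-6 dump and its load might look like:

```
dump database mydb to "/sybase_dumps/mydb.dmp"
  with compression = 6
go

-- No compression option is needed on load; ASE detects it
load database mydb from "/sybase_dumps/mydb.dmp"
go
```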

OTHER TIPS

If you're already using compress_level 9 and still want more compression, it's possible to re-compress the file with bzip2.

If you simply bzip2 the compressed file, you will get roughly a 10% improvement. If you uncompress and re-compress, you may see improvements in the 30% range. Bear in mind, though, that you'll have to bunzip2 and re-gzip the file before Sybase can load it again.

gunzip -c pubs_1.dmp | bzip2 > pubs.dmp.bz2
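The reverse pipeline, needed before Sybase can load the file again, is just the mirror image of the command above. A self-contained round trip on a stand-in file (the `pubs` filenames are placeholders; real dumps would come from ASE):

```shell
#!/bin/sh
set -e
cd "$(mktemp -d)"

# Stand-in for a compressed Sybase dump: some data, gzip-compressed
printf 'dummy dump contents\n' > pubs_1.raw
gzip -c pubs_1.raw > pubs_1.dmp

# Re-compress with bzip2 for archival (smaller, but more CPU)
gunzip -c pubs_1.dmp | bzip2 > pubs.dmp.bz2

# Before loading, convert back to the gzip format Sybase expects
bunzip2 -c pubs.dmp.bz2 | gzip > pubs_restored.dmp

# Verify the round trip preserved the data
gunzip -c pubs_restored.dmp | cmp - pubs_1.raw && echo "round trip OK"
```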

Whilst the syntax above ("The syntax is") is correct, as it comes from the Sybase documentation, the comments are misleading.

The syntax in simple format is:

dump database {database_name} to "compress::{compression_level}::{stripe_device}"
go

eg: dump database mydb to "compress::1::/sybase_dumps/mydb_17022009"

In terms of loading the database dump, the compress:: option needs to be given again.

eg: load database mydb from "compress::/sybase_dumps/mydb_17022009"

Notice that when loading, neither the compression level nor the extra separating colons are required.

The point about testing to find your balance is a good one; remember that the higher you go, the considerably longer you should expect the dump to take. I find 1-3 is more than ample, and I have never gone above 6, as the diminishing returns are not worth it.
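The diminishing returns are easy to demonstrate with any level-based compressor. Here gzip stands in for ASE's dump compression (the levels only map roughly, so treat this purely as an illustration of the size-versus-level trade-off):

```shell
#!/bin/sh
set -e
cd "$(mktemp -d)"

# Build a compressible sample file (repetitive text, like typical row data)
i=0
while [ $i -lt 2000 ]; do
  echo "row $i: some repeated column values, padding padding padding" >> sample.dat
  i=$((i + 1))
done

# Compress at the lowest and highest levels and compare sizes
gzip -1 -c sample.dat > level1.gz
gzip -9 -c sample.dat > level9.gz
ls -l level1.gz level9.gz
```

On repetitive data the level-9 file is smaller, but only modestly so relative to the extra CPU time, which is the same trade-off you tune with compress_level.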

If I were desperate, I would bzip2 the file as described above. If this were a production host, I would ship the file to another host and do it there, as the resource hit can be considerable.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow