Question

I just discovered that the automated dumps I've been creating of my SVN repository have been getting cut off early and basically only half the dump is there. It's not an emergency, but I hate being in this situation. It defeats the purpose of making automated backups in the first place.

The command I'm using is below. If I execute it manually in the terminal, it completes fine; the output.txt file is 16 megs in size with all 335 revisions. But if I leave it to crontab, it bails at the halfway mark: around 8.1 megs with only the first 169 revisions.

# m h  dom mon dow   command
18 00 * * * svnadmin dump /var/svn/repos/myproject > /home/andrew/output.txt 

I actually save to a dated gzipped file, and there's no shortage of space on the server, so this is not a disk space issue. It seems to bail after two seconds, so this could be a time issue, but the file size is the same every single time for the past month, so I don't think it's that either. Does crontab execute within a limited memory space?


Solution

So, I don't know what the real issue is here, but if I route the STDERR of svnadmin to /dev/null when I do the dump, everything goes well. I tried using the "quiet" flag (-q) and that also succeeds. I'm assuming that when a shell script running from a crontab produces enough text on STDERR, cron stops execution of whatever it's running and moves on to the next instruction. I've done an MD5 on a manually dumped file and a scheduled one, and they are identical, so this seems to be resolved. If anyone else encounters this issue, this is the shell script I used to successfully get past the early truncation. It's a little verbose. Sorry.

#!/bin/sh
# Append a start marker to the log
echo "STARTING AT $(date +%Y/%m/%dT%I:%M:%S)" >> /home/andrew/svnlog.txt
# Clear out any dump left over from a previous run (-f: don't complain if it's missing)
rm -f /tmp/andrewMobileApp.dump
# Discarding STDERR here is the workaround that stops cron from truncating the dump
svnadmin dump /var/svn/repos/andrewMobileApp > /tmp/andrewMobileApp.dump 2>/dev/null
echo "svnadmin exited with code $?" >> /home/andrew/svnlog.txt
# Compress the dump into a dated backup file
gzip -c /tmp/andrewMobileApp.dump > "/home/andrew/svnbackups/andrewMobileApp.dump.$(date +%Y%m%d%I%M%S).txt.gz"
echo "gzip exited with code $?" >> /home/andrew/svnlog.txt
echo "DONE AT $(date +%Y/%m/%dT%I:%M:%S)" >> /home/andrew/svnlog.txt
echo "-----" >> /home/andrew/svnlog.txt

This script is invoked through a super-user crontab.
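For reference, if you don't need the per-step logging, a single crontab entry using the -q flag mentioned above should avoid the truncation in the same way. This is just a sketch using the same paths as the question; remember that % must be escaped in a crontab line if you add a date to the filename:

# m h  dom mon dow   command
18 00 * * * svnadmin dump -q /var/svn/repos/myproject > /home/andrew/output.txt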

OTHER TIPS

First of all, make sure that svnadmin is in the PATH environment variable of the cron job. You may have to specify the full path, e.g. /usr/bin/svnadmin or whatever is appropriate on your system.
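For example, assuming svnadmin actually lives in /usr/bin (check with "which svnadmin"), either of these crontab variants should work:

# Option 1: set PATH explicitly at the top of the crontab
PATH=/usr/bin:/bin
18 00 * * * svnadmin dump /var/svn/repos/myproject > /home/andrew/output.txt

# Option 2: call svnadmin by its full path
18 00 * * * /usr/bin/svnadmin dump /var/svn/repos/myproject > /home/andrew/output.txt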

Also, rather than svnadmin dump, you may want to look into svnadmin hotcopy, which is intended for making backups of the whole repository.
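Roughly, hotcopy takes the repository path and a destination path; the backup directory below is just an example:

# copies the entire repository (including hooks and config) to a backup location
svnadmin hotcopy /var/svn/repos/myproject /home/andrew/svnbackups/myproject-hotcopy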

I submitted this as a bug against the Debian cron package, since I was experiencing it there as well (under a vserver). See Debian bug #577133. Christian Kastner patched it so that the mail-processing code is bypassed entirely when no MTA is found.

Do you have a size limit (quota) on the user's home directory? You might want to try dumping to /tmp first.
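A quick sanity check, assuming user quotas are enabled and the quota tools are installed:

# show any disk quota for the user, then free space on /home and /tmp
quota -s andrew
df -h /home /tmp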

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow