Question

Bit of a curly one that I haven't been able to find the answer to. I have a pre-commit hook for git that mysqldumps the database and stages the dump file for committing.

I'm running Drupal for a project, so quite a bit of configuration is done through the interface and stored in the database, and I'm keen to capture point-in-time milestones of that state in version control.

Here's the basic cut-down test I'm trying this with:

#!/bin/bash

# Pre-commit hook: dump the database and stage it as a point-in-time snapshot
SITE_PATH="/data/site.com"
cd "$SITE_PATH" || exit 1

# Dump the Drupal database to a file tracked in the repository
FILE_NAME="db/site.sql"
mysqldump --skip-extended-insert --skip-dump-date db_name > "$FILE_NAME"

# Stage the refreshed dump so it is included in this commit
git add "$FILE_NAME"

The issue is that if nothing is staged before running git commit, the pre-commit script runs but the commit isn't completed. Running echo $? afterwards shows that git commit exited with code 1 (because there was nothing in the index to begin with), yet git status shows that the db dump is now staged, ready for the next commit.
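
Roughly what that looks like in a terminal (a hypothetical session; git's exact messages vary by version):

$ git commit -m "test commit"
# the pre-commit hook runs and stages db/site.sql, but git still aborts
$ echo $?
1
$ git status
# db/site.sql now appears under "Changes to be committed"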

The commit works if I have any other file added to staging before running the commit.

I could obviously write a wrapper script that stages everything (including the dump) up front and then calls git commit at the end, along the lines of the sketch below, but can anyone shed some light on best practices for this situation?
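
A minimal sketch of that wrapper approach, reusing the paths from the hook above (the script name and any commit message passed to it are placeholders):

#!/bin/bash
# commit-with-db.sh -- hypothetical wrapper: dump and stage the database
# first, then commit, so the index is never empty when git commit runs.
set -e

SITE_PATH="/data/site.com"
FILE_NAME="db/site.sql"

cd "$SITE_PATH"
mysqldump --skip-extended-insert --skip-dump-date db_name > "$FILE_NAME"
git add "$FILE_NAME"

# Pass any extra arguments (e.g. -m "message") straight through to git commit
git commit "$@"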

Solution

git commit --allow-empty

That stops git from aborting a commit that doesn't change any files relative to the previous commit, which is the check that was failing here when nothing was staged before the hook ran.
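
For this use case that simply means committing with the flag (or wrapping it in an alias); the message below is only a placeholder:

git commit --allow-empty -m "Point-in-time snapshot of site config and content"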

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow