Using GitHub for full WordPress site backups

They say that you should always have a backup for your backups.

Well, if I’m being honest, for years I haven’t even had a backup. Not anymore.

I’m very proud to report that I am now using GitHub as a full backup for my website (a small blog).

The website is around 250 MB worth of images and other minor files.

The plan

I want daily backups of any code and database changes on my WordPress site.

The backups can’t just be snapshots; I want a versioned change history.

I want this to be free and for the backup to sit somewhere I trust to stay up.

After the initial setup, I don’t want any further manual intervention.

It can’t rely on me doing anything, because I am lazy and I will forget.

The how

My thinking was simple: run mysqldump, gzip the SQL, and write that to a file in the project folder.

Then git add in the root folder, so all uploads and the changed gzip file get staged, commit, and push to the remote.
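As a first draft, that pipeline is a one-liner. Here is a sketch with placeholder values (and the placeholders are exactly the problem, as you'll see in a moment):

// First draft: dump and compress in one go.
// DBUSER, DBPASSWORD and DBNAME are placeholders, not real values.
exec('mysqldump -u DBUSER -pDBPASSWORD DBNAME | gzip > backup.sql.gz');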

The trickiness started when I realised that for the MySQL dump to happen, I’d need the DB credentials, and I definitely don’t want to store those details in my backup script.

Then I remembered that wp-config.php exists. Why not just parse that, right?

Doing that manually sounded like a chore, so ChatGPT came to the rescue.

// Read wp-config.php and pull each credential out with a regex.
$config = file_get_contents($wpConfigPath);
preg_match("/define\('DB_NAME', '(.+?)'\);/", $config, $dbNameMatch);
preg_match("/define\('DB_USER', '(.+?)'\);/", $config, $dbUserMatch);
preg_match("/define\('DB_PASSWORD', '(.+?)'\);/", $config, $dbPasswordMatch);
preg_match("/define\('DB_HOST', '(.+?)'\);/", $config, $dbHostMatch);

Simple as that, we now have the database details. At first I wrote mysqldump plus a gzip command; compressing the SQL from 4 MB to 1 MB sounded like a no-brainer.

However, upon further reflection, since I’m using git, that’s a bad idea: a gzipped dump is an opaque blob as far as git’s diffs are concerned, so every daily backup would mean a full new 1 MB file, stored in git for eternity, and that just feels wasteful.

Skipping gzip and storing the full SQL file in the repository makes so much more sense. While it’s larger for the initial commit, it “pays for itself” within just four days, because if there are no DB changes, there is no new SQL file!

And if there are changes, the git burden is only as large as the actual SQL lines that changed.
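For the diffs to actually stay that small, the dump needs to be line-oriented and deterministic. Here is a sketch of what the dump step could look like, using the credentials parsed above ($repoPath and the exact flags are my assumptions: --skip-extended-insert writes one INSERT per row so git can diff it, and --skip-dump-date stops a timestamp comment from dirtying every dump):

// Dump plain SQL straight into the repo; escapeshellarg() keeps the
// credentials from breaking (or abusing) the shell command.
$dumpFile = $repoPath . '/database.sql';
$cmd = sprintf(
    '/usr/bin/mysqldump --host=%s --user=%s --password=%s --skip-extended-insert --skip-dump-date %s > %s',
    escapeshellarg($dbHost),
    escapeshellarg($dbUser),
    escapeshellarg($dbPassword),
    escapeshellarg($dbName),
    escapeshellarg($dumpFile)
);
exec($cmd, $output, $resultDump);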

Once that’s done, we just commit, which is easy enough thanks to exec():

exec("/usr/bin/git add " . $repoPath, $output, $resultAdd);
exec("/usr/bin/git commit -m '$commitMessage'", $output, $resultCommit);
exec("/usr/bin/git push", $output, $resultPush);

Finally, the backup script needs to be triggered.

I tried a couple of approaches from the command line, but none worked; I think my shell user is somewhat restricted for some reason. So I ended up using the Advanced Cron Manager WordPress plugin to do the triggering.

For that, we need to add the following to the theme’s functions.php:

function cron_trigger_backup_script_cron_92f1d671() {
    // Absolute path to the backup script on this host.
    $scriptPath = '/home/sites/37b/3/396f39af60/public_html/backups/backup.php';
    // Run it with the host's PHP 8.2 CLI binary.
    exec("/usr/bin/php82 $scriptPath", $output, $result);
}

add_action( 'trigger_backup_script_cron', 'cron_trigger_backup_script_cron_92f1d671', 10, 0 );
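For reference, WordPress core’s own scheduler can fire the same hook without a plugin; a minimal sketch of that alternative (not what I ended up using):

// Register the daily event once; wp_next_scheduled() guards against duplicates.
if ( ! wp_next_scheduled( 'trigger_backup_script_cron' ) ) {
    wp_schedule_event( time(), 'daily', 'trigger_backup_script_cron' );
}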

Get the script

The full PHP backup script can be found as a Gist on GitHub.

Conclusion

I’m really happy with this, in part because I said I would do it and I did it. As of late, I’m really big on being kind to myself and following through on the promises I make to myself.

The backup works, it writes to a private GitHub repo, and I am at peace knowing I won’t lose my content.