March 7, 2018 by Daniel P. Clark

Micro Backups of Your Git Dev Work

When writing software, we often try out changes just to see what happens.  Sometimes this leads us to bugs in projects we depend on, and depending on the issue we may simply work around it.  When that happens, the non-working code usually isn't committed to a version control system such as Git; it's discarded in favor of a working change.  After all, who doesn't want to write good code, maintain a reputation for quality, and avoid breaking things for others?  The only problem is that so much useful information from those errors is lost, when it could have been referenced later to help fix them.

This blog post illustrates a simple way to keep incremental backups of your project's state, along with the logs, for the record.

Wrapping a Command in Fish Shell

You can implement this in whichever shell you like.  I prefer the Fish shell because it's a pleasure to work with.  Take what you learn here and make it your own.

function rake
  set -l PROJECT (basename (pwd))
  set -l TIMESTAMP (date +%s)
  set -l ROOT "/mnt/raid/snapshots/"
  set -l BASENAME "$ROOT$PROJECT-$TIMESTAMP"

  # Run rake through bash so pipefail preserves rake's exit status
  # while tee writes a timestamped log of everything it printed.
  bash -c "set -o pipefail; rake $argv |& tee $BASENAME.log"

  set -l STATUS $status

  # Archive every tracked file, plus untracked files not excluded by .gitignore.
  for file in (git ls-files) (git ls-files --exclude-standard --others)
    if echo $file | grep assets > /dev/null
      continue
    end
    tar --no-recursion -uf $BASENAME.tar $file
  end

  gzip $BASENAME.tar

  if test "$STATUS" -ne 0
    echo "rake exited with status $STATUS; snapshot kept at $BASENAME.tar.gz"
  end

  return $STATUS
end

This script creates both a log of your rake command's output and a gzipped backup of your project's current folder, minus any files listed in your .gitignore file.  The log and the gzip file share the same name, built from your project folder's name and a timestamp, and are stored in the same place.
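The archive/restore round trip can be sketched in isolation.  This is a minimal demonstration using a temporary directory and an illustrative file name, not the wrapper's real paths; it uses the same `tar --no-recursion -uf` and `gzip` steps as the function above.

```shell
set -e
# Work in a throwaway directory (paths here are illustrative).
WORK=$(mktemp -d)
cd "$WORK"
echo "hello" > app.rb

# Same flags as the wrapper: update-append per file, no directory recursion.
# GNU tar creates the archive on the first -u if it doesn't exist yet.
tar --no-recursion -uf snapshot.tar app.rb
gzip snapshot.tar

# Restore into a scratch folder for inspection.
mkdir restore
tar -xzf snapshot.tar.gz -C restore
cat restore/app.rb   # → hello
```

Appending file-by-file with `-u` is what lets the loop skip unwanted paths individually instead of archiving the whole tree at once.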

Be generous with the list of files in your .gitignore, as this script will grab any uncommitted files that aren't excluded there.  Anything not covered by .gitignore can be filtered in the grep line inside the for loop above, which skips any file, or any file within a folder, that has assets in its name.
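To see how that grep test behaves, here is a small stand-alone sketch of the same filtering, with illustrative file paths and a pattern extended to several exclusions (using `-v` to show what survives, where the loop above instead matches and skips):

```shell
# Any path matching the pattern is dropped; everything else would be archived.
# The paths and the 'tmp' pattern are illustrative additions.
KEPT=$(printf '%s\n' app/assets/logo.png lib/core.rb tmp/cache.bin \
  | grep -vE 'assets|^tmp/')
echo "$KEPT"   # → lib/core.rb
```

Swapping plain `grep assets` for `grep -E 'pattern1|pattern2'` in the function lets you skip several path patterns with one test.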

The only thing left to do is set up a cron job to delete archives beyond a certain age.  Three to seven days is a good window; that's the period in which you're most likely to wish you still had the information.
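The cleanup the cron job would perform is just a `find -mtime +N -delete` over the snapshot directory.  Here is a self-contained sketch on a temporary directory (the file names are illustrative):

```shell
WORK=$(mktemp -d)
touch "$WORK/fresh.tar.gz"
touch -d '10 days ago' "$WORK/stale.tar.gz"

# -mtime +7 matches files last modified more than 7 days ago.
find "$WORK" -name '*.tar.gz' -mtime +7 -delete

ls "$WORK"   # → fresh.tar.gz
```

The corresponding crontab entry, assuming the `/mnt/raid/snapshots` root from the function above, would look something like: `0 3 * * * find /mnt/raid/snapshots \( -name '*.tar.gz' -o -name '*.log' \) -mtime +7 -delete`.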


Backups are important, and failures can contain valuable data.  Keep a cache of your recent work so that when you hit bugs in other projects, you can help them with data that's readily available.  Don't do what I did when I submitted a GitHub issue which I no longer have the proof for.  “Keep it secret, keep it safe” — Gandalf

Feature image by Ian via the Creative Commons Attribution-NonCommercial 2.0 Generic License.



  1. Dmitry
    January 22, 2020 - 4:02 pm

    I’m using borgbackup; it has everything you may need, including incremental backups, compression, encryption, cron support, and archive cleanup.
