So all this talk of redundant, synchronous, and autonomous backup/cloud services like Dropbox and, more recently, Google Drive has got me thinking: why sacrifice both space and privacy for cloud storage? Why not set up my own autonomous, synchronous, cloud-like backup solution?
So, call me crazy, but I think it may be possible not to rely on a third-party service for internet-based data storage!?! Nuts, right? After arguing on some forums with folks @ http://www.theverge.com I've decided to prove to them that I can serve up a DIY alternative to Google Drive or Dropbox. Ideally, I'm looking to set up a backup using both rsync and unison, triggered with a bash script, with the data then accessible from the server via a LAMP setup. Ambitious, I know, but worth it.
Let's begin…
So here is a script I've modified to fit my needs:
#!/bin/sh
set -u
set -x

# Remote host and the source/destination folders.
HOST=andrew@67.241.242.136
SRC=/Users/asow123/Cloud
DST=/Users/asow123/Cloud

# Rotate the existing snapshots: drop the oldest and shift the rest down.
if ssh "$HOST" test -d "$DST"/backup.0; then
    ssh "$HOST" rm -rf "$DST"/backup.3
    ssh "$HOST" mv "$DST"/backup.2 "$DST"/backup.3
    ssh "$HOST" mv "$DST"/backup.1 "$DST"/backup.2
    ssh "$HOST" mv "$DST"/backup.0 "$DST"/backup.1
fi

# Copy the source over ssh, hard-linking unchanged files against the previous
# snapshot, then promote the finished copy to backup.0.
rsync --progress --archive --delete -F --rsh=ssh --link-dest=../backup.1 \
    "$SRC" "$HOST":"$DST"/backup.pre &&
    ssh "$HOST" mv "$DST"/backup.pre "$DST"/backup.0
The script makes backups of the source folder ($SRC, here /Users/asow123/Cloud) and places them into snapshot directories on the server. It keeps a rotating series of snapshots, backup.0 (newest) through backup.3 (oldest); once that limit is reached, the oldest snapshot is deleted and the rest shift down to make room for the newest.
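As a quick sanity check that the snapshots are usable, a single file can be pulled back out of the newest snapshot with plain rsync. This is only a sketch: notes.txt and the local ~/restored/ folder are made-up names, but the host and paths match the script above (the extra Cloud/ level exists because rsync was given the source folder without a trailing slash):

# Restore one file from the newest snapshot (backup.0) on the server.
rsync --archive --progress --rsh=ssh \
    andrew@67.241.242.136:/Users/asow123/Cloud/backup.0/Cloud/notes.txt \
    ~/restored/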
So after careful consideration, I've come to the conclusion that bidirectional synchronization is the way to go. Unison is a great program that does just that with minimal configuration.
Here is the script that synchronizes my two folders:
#!/bin/bash

# Unison binary (installed via Homebrew), remote server, and remote path(s).
_paths="/home/andrew/Cloud"
_unison=/usr/local/Cellar/unison/2.40.63/bin/unison
_rserver="67.241.242.136"

# Sync the local Cloud folder against each path on each remote server,
# in batch mode so unison doesn't stop to prompt.
for r in ${_rserver}
do
    for p in ${_paths}
    do
        ${_unison} -batch /Users/asow123/Cloud "ssh://${r}/${p}"
    done
done
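Before wiring this into cron (next step), it's worth running the script once by hand so unison can build its archive files for both replicas and any errors show up in the output. The file name cloudsync.sh is just a placeholder I'm using for the script above:

chmod +x cloudsync.sh
./cloudsync.sh    # first run builds unison's archives on both sides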
Running this script by hand every time I want to back up would be inconvenient, so what I've done is schedule it in a crontab. In order for the cron job to connect automatically, I generated a fresh id_dsa key pair and added the public key (id_dsa.pub) to authorized_keys in the ~/.ssh folder on the server side.
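For reference, this is roughly what the key setup and the crontab entry look like. The script path /Users/asow123/bin/cloudsync.sh and the 30-minute schedule are placeholders I've chosen for illustration:

# Generate a DSA key pair (leave the passphrase empty so cron can use it),
# then append the public key to authorized_keys on the server.
ssh-keygen -t dsa -f ~/.ssh/id_dsa
cat ~/.ssh/id_dsa.pub | ssh andrew@67.241.242.136 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'

# Crontab entry (added with crontab -e): run the sync script every 30 minutes
# and append its output to a log file.
*/30 * * * * /Users/asow123/bin/cloudsync.sh >> /Users/asow123/cloudsync.log 2>&1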