Offsite backup strategy, which to choose?
Oct 24, 2012, 05:11 PM
I'm in a bit of a dilemma here. I have a pretty nice backup solution at home, with my originals being backed up to a Mac mini server hosting Time Machine. Two clients (so far) are backed up with Time Machine to the mini, and the mini itself gets weekly images (SuperDuper!) as well. So, each computer has one original and one copy (all copies are accessible from the Mac mini server).
The crux is: how do I transfer this in a nice way to a remote FTP server? I have a friend with whom I have set up a co-location deal, so I have access to his FTP server with enough space to house the "third copy". I would like a SuperDuper!-style incremental backup (with only 10 Mbps available, >400 GB full backups would take days), but I am not really sure how to do it.
These are the strategies I've already tried:
1) Transmit's "folder sync". I don't really trust this to give me a good copy of, for example, the Time Machine bundles. It works in a pinch, but then I'd rather copy the files manually using FileZilla/Cyberduck.
2) ExpanDrive and SuperDuper!. ExpanDrive is a bit flaky, and mounting the drives doesn't always work, which makes SuperDuper! unusable. Also, I have to move a .dmg file TO the FTP server first and mount it from the mounted ExpanDrive volume, since the file system on the FTP server is not supported by SuperDuper! (some RAID stuff, I guess... not really sure why this is not working).
So, what to try? Has anyone had similar experiences? I could probably write a Python script that checks "last edited" timestamps and transfers the changed files over some Python FTP module, but I'm not sure how package bundles like the iPhoto library would be handled. Having to re-transfer the whole iPhoto library every time I change a photo would be a bit messy as well.
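A sketch of such a script using only the standard library (the hostname, credentials, and manifest filename below are placeholder assumptions). Rather than trusting the FTP server's timestamps, it keeps a local JSON manifest of the mtimes recorded at the end of the previous run and uploads only files whose mtime has changed:

```python
import json
import os
from ftplib import FTP  # stdlib, plain FTP only

def changed_files(root, manifest):
    """Return relative paths under root whose mtime differs from the manifest."""
    changed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            if manifest.get(rel) != os.path.getmtime(path):
                changed.append(rel)
    return changed

def sync(root, host, user, password, manifest_file="ftp-manifest.json"):
    # Load the mtimes recorded by the previous run, if any.
    try:
        with open(manifest_file) as f:
            manifest = json.load(f)
    except FileNotFoundError:
        manifest = {}
    todo = changed_files(root, manifest)
    ftp = FTP(host)
    ftp.login(user, password)
    for rel in todo:
        # Assumes the matching directory tree already exists on the server;
        # a fuller script would create missing directories with ftp.mkd().
        with open(os.path.join(root, rel), "rb") as f:
            ftp.storbinary("STOR " + rel, f)
        manifest[rel] = os.path.getmtime(os.path.join(root, rel))
    ftp.quit()
    with open(manifest_file, "w") as f:
        json.dump(manifest, f)
```

This also answers the iPhoto worry, at least in principle: the iPhoto library is really a folder full of smaller files, so os.walk() descends into it and only the files inside it that actually changed get re-uploaded, not the whole library.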
Oct 28, 2012, 11:45 AM
What I do is run VPN services on my home server, which holds the main filestore that I run off-site backups of. I have an old G4 mini with an external drive in an office. It runs ChronoSync and is scheduled, once a week, to connect the VPN (if it's not still running from the previous week), mount the drive, and then back up. Works very well. I also run iTeleport on the remote G4 so I can dial into it in case of issues.
Oct 28, 2012, 01:13 PM
You already mention sync and filezilla, so here is another alternative...
Assuming you want something other than the various cloud storage solutions and filezilla, then have you thought about git or svn (version control systems that can be run from the command line)?
Question is: do you need a bootable backup or a Time Machine backup, or will just backing up your own personally generated content suffice, given that you can always retrieve the OS and applications by other means? If a backup of just your own personal content is required, then you might think about using git or svn for remote backup of your content. The beauty of using these version control systems for backup is that they automatically keep track of what needs to be transferred and what doesn't, and they perform the transfers differentially so that the minimum amount of data needs to be communicated. In addition, you can run them over ssh, meaning that all you need is an ssh terminal connection. So, your friend might be using his/her computer and at the same time you can log in via ssh, without disturbing his/her window session, and use the git or svn commands to update (back up) your files on his/her machine. Since git/svn runs under Windows/Mac OS/Linux, you can do this from most any system.
Subversion (svn) is a centralized repository system, and all local copies are basically twice the size of the actual data, since a copy of the committed files is also kept locally in order to determine what has been modified. Git is a distributed repository system. Both systems have their own advantages/disadvantages. While most people think of these in terms of program code for large projects (such as the Linux kernel, GNU Project, etc.), they work equally well for any files contained in a directory tree. They keep all back revisions, with records of who performed the revisions and comments on what was revised. Of course, the initial transfer (checkout) from the repository is time consuming (all of the data must be transferred once, just like with any backup system), but after that only the differences between the local copies and the repository get transferred. So, once set up, these version control systems are fairly fast and easy to manage (really, only two terminal commands are needed for 99% of all backup requirements). And you can make as many offsite backups as you have friends with whom you make similar arrangements.
So, you might make a clone of your computer's boot drive onto a USB drive and physically give this disk to your friend. This clone is bootable and has your complete OS and applications on it, so in a worst-case scenario you can always retrieve this disk and boot your machine from it if needed. You then set up a git or svn repository of your own work (the part that changes continuously) on both your machine and your friend's, and both of you then use git or svn to update your constantly changing content remotely whenever you feel the need to back up. Thus, unless both of your abodes burn down simultaneously, you will both always be able to recover your work, OS, and applications.
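If you do go the git route, the commit-and-push cycle is easy to automate. A minimal sketch in Python (the repository path and commit message are placeholders; it assumes git is installed and an "origin" remote pointing at your friend's machine, e.g. over ssh, is already configured):

```python
import subprocess

def offsite_backup(repo_dir, message="offsite backup"):
    """Stage, commit, and push everything in repo_dir to its 'origin' remote."""
    subprocess.run(["git", "add", "-A"], cwd=repo_dir, check=True)
    # 'git status --porcelain' prints nothing when there is nothing to commit.
    status = subprocess.run(["git", "status", "--porcelain"], cwd=repo_dir,
                            capture_output=True, text=True, check=True)
    if not status.stdout.strip():
        return False  # nothing changed since the last backup
    subprocess.run(["git", "commit", "-m", message], cwd=repo_dir, check=True)
    # Push the current branch, whatever it is named.
    subprocess.run(["git", "push", "origin", "HEAD"], cwd=repo_dir, check=True)
    return True
```

Run it from cron or launchd on whatever schedule suits you, and the version control system takes care of figuring out what actually needs to go over the wire.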
Just a thought...
Oct 29, 2012, 07:20 AM
Thanks for the ideas Switon. The problem is that I have only FTP access to the machine (ssh is blocked by his router and he doesn't want to do the port forwarding needed, since he is as paranoid as I am :) ), and I even have trouble convincing him to turn on the AFP feature on his NAS. I believe that setting up a git/svn server would be "too much" for him :(
Although... I could try to do an svn/git checkout to an FTP drive mounted with ExpanDrive.
I would prefer to have this automated, without moving around physical drives etc., but it seems to be a bit difficult to get it as smooth as I would want.
Oct 29, 2012, 10:10 AM
I use rsync
Oct 29, 2012, 12:41 PM
Since you have already opened port 21 for FTP access, why not just switch to port 22 for ssh access? FTP is less secure than ssh anyway, and with ssh you also get scp (secure copy) and sftp (secure FTP), so if your friend wishes to use FTP he/she could just use sftp instead. Since port 22 is all that is needed for running svn over ssh, you don't have to open anything else to run a Subversion repository. You don't even have to open up your firewall for AFP, and the svn clients on Mac OS, Windows, and Linux will automatically handle whatever filesystem each is using; there is no need to mount AFP/NFS/SMB filesystems over the network! (Mountain Lion, if you install Xcode, has the ssh client and server built in already, so you don't have to install anything else. I don't know about Windows, but the various Linuxes also have ssh built in.)
The fact that svn runs over an ssh tunnel means that only a single port, 22, is required, and the communications are encrypted and thus considerably more secure. You say your friend is paranoid about security; well, personally I'd be much more worried about FTP security than ssh security.
Oct 30, 2012, 09:09 AM
Just so you know, my suggestion needs no ports opened on the remote end, which is one of the reasons I went that route. It does require the remote machine's internet connection to allow VPN tunnels, though (but 90% of the time that is not an issue).
Oct 30, 2012, 03:30 PM
Yeah, I love VPN and use it all the time too...and it is especially useful for mounting LAN drives over the Internet.
I don't quite understand what you mean when you say that no open ports are needed on the remote end, as I believe VPN does require open ports... they are just different ports than FTP or SSH.
The VPN L2TP protocol (more secure) requires UDP ports 500, 1701, and 4500 to be open, while the VPN PPTP protocol (less secure) requires TCP port 1723 to be open through the router/firewall.
The OP has already opened port 21 for FTP access, while I suggested the same type of access but using ssh with port 22 open. SSH is more secure than FTP. VPN is also a good alternative, but it too requires ports to be opened, as listed above. I suggested that a version control system might be a good way to perform the backups, keeping everything in sync without actually having to log in to the OP's friend's machine and vice versa. For instance, using Subversion over ssh, the OP would type the command svn commit -m "Updating my work" in the terminal window on his/her computer, followed by supplying a password, to commit and thus back up the modified files/directories. The version control system automatically keeps track of what needs to be "committed", i.e., backed up. Of course, there are numerous other solutions to this remote backup problem, including using iCloud.