Using Transmit To Backup Your Referenced Masters
by Micah Walter
In theory this is a nice feature, since you can stop the process and pick it up again later, but it causes problems when uploading a large amount of data. When the cache fills up, the Synk operation can be interrupted. It doesn’t happen every time, but on occasion the interruption fails the backup and I have to start the whole thing over again.
JungleDisk also uses a fairly interesting method of storing your files. Instead of simply creating folders and files in your S3 bucket, JungleDisk creates a flattened directory structure, folding the folder names you create into the filename itself. This works perfectly well as long as you always use JungleDisk to interface with your S3 account, but once you try to connect with some other application things can get confusing.
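To make the "flattened" idea concrete: S3 has no real folders, just object keys, so a client can fold a folder path into a single flat name. The sketch below is purely illustrative; the delimiter and functions are hypothetical, not JungleDisk's actual (undocumented here) scheme:

```python
# Hypothetical illustration of flattening a folder hierarchy into flat
# S3 object names. JungleDisk's real naming scheme is internal to the
# app; the "__" delimiter here is an assumption for demonstration only.

def flatten_key(path, delimiter="__"):
    """Fold a slash-separated folder path into one flat object name."""
    return path.replace("/", delimiter)

def unflatten_key(key, delimiter="__"):
    """Recover the original folder path from the flat name."""
    return key.replace(delimiter, "/")

print(flatten_key("Pictures/2007/Vacation/IMG_0042.jpg"))
# → Pictures__2007__Vacation__IMG_0042.jpg
```

A generic S3 browser would show you the long flattened name, which is exactly why connecting with another application looks confusing.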
So, I am experimenting with Panic’s Transmit. Transmit has been around forever and has served me well as an FTP program for a long time. Now that it supports Amazon S3, I have yet another use for this fine program. What’s more, Transmit offers additional features such as .Mac preference syncing and built-in Automator actions.
I set up my Amazon account and saved it as a preference. I created a new “bucket” and then made some sub-folders. I pointed to the folder of pictures I wanted to upload and clicked Synchronize. After I chose from a number of options for how the syncing should behave, it was off and running. It writes the files one at a time, copying them from my network drive to the laptop and then uploading them to S3. It is going fairly slowly, but it seems to be working without any problems.
It would be really nice to eventually get everything up on S3 and then simply sync any newly added files. This would keep a really nice archive of my Master image files on a geographically redundant server out there in the ether. Of course, an Aperture plugin might be a nice idea as well! Hint hint....
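The "sync only newly added files" idea boils down to comparing local paths against the object keys already in the bucket. Here is a minimal sketch of that decision step (function name and signature are my own; a real sync tool like Transmit would also compare modification dates and file sizes, and would then do the actual uploading):

```python
import os

def files_to_upload(local_root, remote_keys):
    """Return local files whose relative paths are not yet in the bucket.

    local_root  -- folder of Master images on disk
    remote_keys -- set of object keys already stored on S3
    Note: this sketch only checks for presence; real sync tools also
    compare modification dates and sizes to catch changed files.
    """
    pending = []
    for dirpath, _dirs, filenames in os.walk(local_root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            # Build an S3-style key relative to the root, with "/" separators
            rel = os.path.relpath(full, local_root).replace(os.sep, "/")
            if rel not in remote_keys:
                pending.append(rel)
    return sorted(pending)
```

After the slow initial transfer, a pass like this would touch only the handful of new Masters, which is what makes the ongoing backup cheap and quick.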
After the first article I tried the S3 approach as a way of backing up Aperture data as a vault, but the initial transfer was always too slow over broadband. In the end, after trying it overnight, I backed away slowly and then used Transmit and Interarchy to clear the mess off the S3 servers. Hours of playing about, but ridiculously cheap at the end of the day.