The solution will depend on what your goals are. If you just want to create a second, coarser-grained duplicate of your data, the most efficient (and convenient) solution is to create a second capture strategy. Let's say you capture "My HD" to the "Upstairs" archive every few hours. You could schedule a second capture of "My HD" to the "Downstairs" archive on the Time Capsule to run once a week, and schedule it to merge, compact, and verify about once a month.
This would preserve all of your critical data daily, and give you a weekly backup should your primary backup system fail.
If you're trying to maintain a complete backup of your regular backup, then simply copying the backup on a regular basis is probably the simplest and most efficient solution. There are a number of file copy/sync/cloning utilities that will schedule a copy to occur on a regular basis (UNIX geeks can do it with cron and cp). You'll want to avoid starting a copy while a QRecall action is in progress, so that you don't make a copy of an archive that's in flux. (It's not a disaster if it happens; you'd still preserve most of the data, but you'd probably have to repair the archive before you could use it.)
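To make the cron-and-cp idea concrete, here's a minimal sketch. The paths are stand-ins (the demo builds a throwaway folder in /tmp rather than touching a real archive); in practice SRC would be your archive and DEST a folder on the Time Capsule volume, and you'd run the script from cron at a time when no capture action is scheduled.

```shell
#!/bin/sh
# Sketch only: SRC and DEST are example paths, overridable from the
# environment. A real setup would point SRC at the archive itself.
SRC="${SRC:-/tmp/demo-archive}"
DEST="${DEST:-/tmp/demo-copy}"

# Demo setup: a stand-in for a real archive package.
mkdir -p "$SRC"
echo "repository data" > "$SRC/repository.data"

# The actual copy: -R recurses into the package,
# -p preserves permissions and timestamps.
rm -rf "$DEST"
cp -Rp "$SRC" "$DEST"

# Confirm the copy landed.
ls "$DEST"
```

A crontab entry like `30 3 * * 0 /path/to/copy-archive.sh` would run it every Sunday at 3:30 AM; just make sure that window doesn't overlap a capture, merge, or verify action.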
You could use QRecall to capture your primary archive to another archive, but it wouldn't be as fast as a straight copy. For one thing, it would mean that you'd have to recall the entire archive from the secondary archive before you could do anything with it. But the other problem is efficiency. That method, and utilities like rsync, incur the same overhead: the source and destination archives both have to be read in their entirety, and then any detected changes are written to the new archive. A straight copy reads the source file once and writes the destination file once, which is actually quicker.
Rsync does work well between two computer systems, where an instance of the rsync program can be started on the remote computer. This is how I keep my off-site backups. I have a couple of modest archives, one on my development system and one on my server (located in a data center across town), where I regularly capture my most critical files, like the source code to QRecall. Once a day I rsync all of them. The archive files are read locally by their respective rsync processes (which is quite fast) and any differences (usually fairly small) are transferred over the Internet. Keeping 100GB of data synced between the two usually takes about an hour each morning. By comparison, it would take almost an entire day to transfer all of that data through my cable-modem connection. But this only works because rsync can run on both machines. Since rsync can't run on a Time Capsule, it falls back to a single process on your local computer that reads all of the remote (Time Capsule) data over the network, which is no faster than a copy.
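For reference, a daily sync like the one described above boils down to a single cron entry. The host, user, and paths here are made up; the flags are standard rsync options.

```shell
# Hypothetical crontab entry: every day at 4:00 AM, mirror the local
# archive to a remote server that can run its own rsync process.
# -a preserves metadata, -z compresses the transfer,
# --partial lets an interrupted transfer resume where it left off.
0 4 * * *  rsync -az --partial "/Archives/Dev.quanta" backup@server.example.com:/archives/
```

Because rsync runs on both ends, only the changed blocks of the archive cross the wire.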
Another good solution is to create rotating backups. Set up two identical capture strategies to different archives; let's call them VaultA and VaultB. Create VaultA on one removable drive and VaultB on a second removable drive. On all of the actions, set the "Ignore if no archive" condition. Now, plug the drive with VaultA into your computer and capture your files to it on a regular basis for a week or a month. Then unplug the drive, take it to an off-site location (like a safety deposit box), return home, and plug in the drive with VaultB. Repeat the process every week or month. By rotating the two drives, you'll have access to all of your captured data going back for years, and if a meteor strikes your home or bank one night (cross your fingers that it's the latter), you'll still have a fairly recent backup of everything.
About mid-way down the list of "things I'd like to add to QRecall" are fail-over archives: basically, the ability to schedule an action that would transfer just what's changed in one archive directly to another archive (or into some other off-site/Internet-friendly format). I think that's exactly what you want, but it's going to take some engineering.