I have lost my data
Forum Index » General
Author Message
vincitytaymodaimo



Joined: 27-Sep-18 13:15
Messages: 2
Offline

I currently use Apple's Backup program to maintain an offsite backup of important files on .Mac (soon to be MobileMe), and I'm wondering if QRecall might do a better job. Before I try it, though, I'd like to check some of the assumptions I'm making about how things work over a network, and get any other thoughts you might have.

Would the processing reduction gained by using smaller archives make it worthwhile to have separate archives for different groups of files, particularly for groups of files that I might want to update less frequently?

Capture compression should reduce the amount of backup data to be transmitted, but would it also impose additional round trips between QRecall and the archive that would negate these savings?

Would merging and other management processes be inordinately time-consuming over the internet?

Do you have any other thoughts, pro or con, about using QRecall over the internet?
James Bucanek



Joined: 14-Feb-07 10:05
Messages: 1434
Online

QRecall (or at least the current incarnation) only works with mountable volumes. So when you say "over the internet" I assume you're speaking of a remotely mounted volume on a server or NAS device.

Compression, and QRecall's unique data analysis, can significantly reduce the amount of data your backups consume. This, in turn, can significantly reduce the amount of data you have to transmit over a network connection. (I have one customer who says recalling from a compressed archive is faster than doing a straight copy of the same files from his networked volume.)
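(QRecall's actual data analysis is internal to the product; purely as a generic illustration, block-level de-duplication of the kind described above can be sketched like this — the block size, hashing scheme, and in-memory store are all my own assumptions, not QRecall's format.)

```python
# Generic sketch of block-level de-duplication: split data into fixed-size
# blocks, store each unique block exactly once, keyed by its SHA-256 digest.
import hashlib

BLOCK_SIZE = 4096

def dedup_store(data: bytes, store: dict) -> list:
    """Store data block-by-block; return the list of block digests (a 'recipe')."""
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # a duplicate block is stored only once
        recipe.append(digest)
    return recipe

store = {}
# Two "files" that share most of their content, as successive backups typically do.
file_a = b"A" * 8192 + b"unique tail"
file_b = b"A" * 8192 + b"different tail"
dedup_store(file_a, store)
dedup_store(file_b, store)
# Six blocks were captured, but only the three unique ones consume space.
print(len(store))   # prints 3
```

The point of the sketch: data shared between captures (or between files) costs nothing extra to store, which is also why it costs nothing extra to transmit.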

Trying to split your archives into different types probably won't help and will ultimately introduce more overhead (due to the need to update multiple archive documents). QRecall is really smart about compression and de-duplication, and it works best when everything is in the same archive. The only major advantage would be to split very large archives into smaller ones so that maintenance is manageable (see last answer).

In modern (multi-core) computer systems, compression adds only a modest amount of CPU and memory overhead. It's measurable, but if you can benefit from the compression it's certainly worth turning on.
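(As a generic demonstration of that trade-off — not QRecall's compressor, whose settings aren't specified here — standard zlib on repetitive backup-like data shows how much transmission volume a little CPU work can buy back.)

```python
# Generic illustration: compressing repetitive data before transmission
# shrinks it substantially, at the cost of some CPU time on the sending side.
import zlib

payload = b"the same log line repeated many times\n" * 1000
compressed = zlib.compress(payload, level=6)   # zlib's default-ish level
ratio = len(compressed) / len(payload)
print(f"{len(payload)} -> {len(compressed)} bytes ({ratio:.1%})")
```

On a slow network link, the seconds of CPU time spent compressing are almost always cheaper than the minutes of transfer time saved.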

Merging shouldn't be particularly slow, even over a slow network connection. The verify, compact, and repair actions will, on the other hand, be slow, because they must read every byte in the archive. Either schedule these tasks to run infrequently (say, once a month), or (if possible) perform them from a system that has a faster connection to the data.
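(A back-of-the-envelope way to see why the read-every-byte actions hurt over a network — the archive size and link speeds below are my own example numbers, not measurements of QRecall:)

```python
# Estimate how long a full read of an archive takes at a given throughput.
def full_read_hours(archive_bytes: float, bytes_per_sec: float) -> float:
    return archive_bytes / bytes_per_sec / 3600

archive = 200e9                                      # assume a 200 GB archive
print(round(full_read_hours(archive, 50e6 / 8), 1))  # over a 50 Mbit/s WAN link
print(round(full_read_hours(archive, 200e6), 2))     # over a 200 MB/s local disk
```

At 50 Mbit/s the full pass takes roughly nine hours, versus well under an hour locally — which is exactly why running verify or compact from a machine with a fast connection to the archive (or only once a month) is the practical choice.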

- QRecall Development -
 