QRecall Community Forum

Extremely long backups
Forum Index » Problems and Bugs
Author Message
Ralph Strauch


Joined: Oct 24, 2007
Messages: 194
Offline
I back up my MacBook Pro over wifi to a backup drive connected to an iMac. My computer usage is fairly light and my daily backups usually take under an hour, or maybe two at the most. I've had a couple of strange backups this week, with QRecall capturing what seemed like an extreme amount of data, almost certainly from files that had not changed. Today's backup seems back to normal, so I'm just reporting this (and sending you a report from the computer) in case the data is of value to you. I'll let you know if the behavior recurs.

My daily backup on 11/12 failed due to a network problem. I reran that backup and it completed successfully, taking about 2 hours. The next day, 11/13, the capture seemed to be going on forever. I cancelled the backup after 10 hours, when it had captured 64.4GB and written 620MB, with 99.52% duplicate. When I reran the backup the next day it took over 16 hours, capturing 138.7GB and writing 1.9GB, with 99% duplicate. Today's backup seemed normal, with 2GB captured and 108MB written, with 95% duplicate. (The drive being backed up contains 522GB.)
James Bucanek


Joined: Feb 14, 2007
Messages: 1568
Online
Ralph Strauch wrote: I've had a couple of strange backups this week, with QRecall capturing what seemed like an extreme amount of data, almost certainly from files that had not changed.

Ralph,

Thanks for the diagnostic report. Out of curiosity, I wrote a script to extract the total capture action time, number of folder changes, total amount analyzed (new+changed files), and the amount of new (unique) data added to your archive.

Legend:
Circles: time the capture took to complete
Squares: number of changed folders. The spikes are an artificial number indicating where QRecall ignored the folder history and performed a deep scan of the filesystem
Orange bars: The amount of data (in GB) captured (read from new or changed files)
Red bars: The amount of unique data (in GB) added to the archive

As you can see, the long capture times correlate pretty closely with either (a) a deep scan of the filesystem or (b) an inordinate amount of captured data. Some of the longest capture times naturally occurred when QRecall captured the most data (the biggest were 8GB, 12GB, 18GB, 59GB, and 139GB).

The real question is why QRecall is capturing all of this data when you believe that not much data has changed. A few of the longer captures are clearly legitimate. The 12GB and 18GB captures, for example, captured 9GB and 8.5GB of new data (respectively). So in those instances, there was at least 8GB of new data that had to be added to the archive.

The suspicious captures are the 59GB (6GB, or 10%, new data) and the 139GB (2GB, or 1.4%, new data). In these cases, it would appear that a substantial amount of the data did not change but was recaptured because QRecall thought it might have changed.

Why QRecall thought the files changed is anyone's guess at this point, although you could compare the file info for the files that were captured with that of previous layers for clues. Any change in the file's creation/modification date, length, name, etc. will cause QRecall to recapture that file's data.
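As a rough illustration (this is a generic sketch, not QRecall's actual code, whose internals aren't shown here), a metadata-based "did this file change?" check comparing the current filesystem state against the previous layer could look like:

```python
import os
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class FileInfo:
    """Metadata snapshot used to decide whether a file needs recapture."""
    name: str
    size: int
    mtime_ns: int   # modification time, nanoseconds
    ctime_ns: int   # inode-change time (catches renames, permission changes)

def snapshot(path: str) -> FileInfo:
    """Record the current metadata for a file on disk."""
    st = os.stat(path)
    return FileInfo(os.path.basename(path), st.st_size,
                    st.st_mtime_ns, st.st_ctime_ns)

def needs_recapture(previous: Optional[FileInfo], current: FileInfo) -> bool:
    # A file with no entry in the previous layer, or with any metadata
    # mismatch, is treated as changed and its data is re-read in full.
    return previous is None or previous != current
```

The point is that the decision is made entirely from metadata: anything that touches a file's dates, size, or name forces the data to be re-read, even if the contents are byte-for-byte identical.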

Another, often overlooked, cause for recapturing a file is renaming a folder. Let's say I have a "Working VMs" folder that I rename "Project VMs". QRecall sees this as a deleted folder and a new folder, which it then captures as if they were all new files. Of course, 99% of the data in the "new" folder will be duplicates of what had already been captured in the "deleted" folder, but the files are still recaptured in their entirety, which takes time.
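This explains captures that read a lot but write almost nothing. A toy sketch of content-addressed deduplication (file-level hashing here for simplicity; a real archive deduplicates at a finer granularity) shows why the renamed folder costs read time but little archive space:

```python
import hashlib

def capture(files: dict, archive: dict) -> tuple:
    """Capture every file in `files` ({path: data}): all data is read,
    but only content not already in `archive` (keyed by hash) is stored.
    Returns (bytes_read, bytes_written)."""
    bytes_read = bytes_written = 0
    for path, data in files.items():
        bytes_read += len(data)
        digest = hashlib.sha256(data).hexdigest()
        if digest not in archive:       # unique content: add it
            archive[digest] = data
            bytes_written += len(data)
    return bytes_read, bytes_written
```

Capturing "Working VMs" and then recapturing the same contents under the hypothetical new name "Project VMs" reads every byte twice, but the second pass writes essentially nothing, since all the hashes are already in the archive.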

In conclusion, I don't see anything amiss, from QRecall's perspective. The question is what's touching or renaming files that might cause QRecall to recapture large swathes of data.
  • [Attachment: capturetimes2.png, 52 KB — Capture time plotted against folder changes, captured data, and new data]


- QRecall Development -
Ralph Strauch


Joined: Oct 24, 2007
Messages: 194
Offline
James,

Here's a bit more information. I alternate two different backup drives, keeping one offsite and swapping between them every week or two. The larger captures generally represent the first backup after a drive was swapped in, with the 18GB capture on 10/25 being the first time that my first backup drive had seen Mavericks. I next swapped drives on 11/08, which is when the first excessive capture (59GB) occurred, and that's when the captures started getting bigger. On 11/13 I stopped the backup after more than 10 hours, when it had captured 64.4GB with 100% duplicate. The next day I restarted it and let it run; that was the one that took 16.5 hours to capture 139GB with 2GB written. The next backup was normal, as was today's.

And thanks again for the support you provide.

Ralph
 
Powered by JForum 2.8.2 © 2022 JForum Team • Maintained by Andowson Chang and Ulf Dittmer