corrupted backup, CRC check failed

Bug #682469 reported by TTimo
This bug affects 2 people
Affects: Duplicity
Status: New
Importance: Undecided
Assigned to: Unassigned
Milestone: (none)

Bug Description

duplicity 0.6.10 and 0.6.11

I have a 22 GB backup that is corrupted. It was being backed up from an OS X MacBook Pro with duplicity 0.6.10 compiled against OS X 10.5's (Leopard) Python; I can't check the exact version because that machine has since died. The data was being backed up to Amazon S3.

I am trying to recover the backup on a Linux machine with duplicity 0.6.11 after copying the data locally, but it fails with a CRC error after about 5 GB of data has been processed. I was able to recover more of the data past the initial CRC failure by asking for specific folders and files, and I eventually narrowed things down to a short list of files that could not be recovered. I am attaching the error log for the first file that I cannot recover.
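
For reference, the "asking for specific folders and files" part amounted to listing the backup contents and then restoring individual paths one invocation at a time. A rough sketch of that, with the file:// URL and the example path as placeholders for my local copy of the S3 data (duplicity will still prompt for the GPG passphrase as usual):

    #!/usr/bin/env python
    # Sketch only: the file:// URL and the example path are placeholders for
    # my local copy of the S3 data, not anything duplicity ships.
    import os
    import subprocess

    BACKUP_URL = "file:///mnt/backup-copy"   # local copy of the S3 bucket
    RESTORE_DIR = "/tmp/restore"

    # Show every path recorded in the current backup chain.
    subprocess.check_call(["duplicity", "list-current-files", BACKUP_URL])

    # Pull out one specific path; if one of its volumes is corrupted this
    # fails with the CRC error, otherwise just that file is restored.
    path = "Documents/notes.txt"             # placeholder path
    target = os.path.join(RESTORE_DIR, path)
    if not os.path.isdir(os.path.dirname(target)):
        os.makedirs(os.path.dirname(target))
    subprocess.check_call(
        ["duplicity", "restore", "--file-to-restore", path,
         BACKUP_URL, target])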

I doubt there is any way to recover all of the individual files at this point, but I thought I'd point out a few things:

- I highly recommend implementing the proposed --test-restore option (https://bugs.launchpad.net/bugs/643973).

- It took some serious script-fu to continue recovering past the initial CRC error. There is still a significant amount of data lost, but it's not that bad. Duplicity should have an option to continue working through the backup and attempt to recover as much as it can; a rough sketch of the workaround follows below.
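
To make that last point concrete, the "script-fu" boils down to something like the sketch below: restore a hand-picked list of paths one at a time, record whatever fails with a CRC error, and keep going. This is an external workaround written for illustration (the URL, directories and paths are placeholders), not an existing duplicity option:

    #!/usr/bin/env python
    # Illustration of the workaround, not a duplicity feature: restoring
    # paths one at a time means a CRC failure in one volume only loses that
    # path instead of aborting the whole restore.
    import os
    import subprocess

    BACKUP_URL = "file:///mnt/backup-copy"      # placeholder local copy
    RESTORE_ROOT = "/tmp/restore"               # placeholder target
    PATHS = ["Documents", "Pictures/2010", "projects"]  # paths to attempt

    failed = []
    for relpath in PATHS:
        target = os.path.join(RESTORE_ROOT, relpath)
        parent = os.path.dirname(target)
        if not os.path.isdir(parent):
            os.makedirs(parent)
        status = subprocess.call(["duplicity", "restore",
                                  "--file-to-restore", relpath,
                                  BACKUP_URL, target])
        if status != 0:
            # Most likely a CRC error in one of the volumes holding this
            # path; note it and move on instead of giving up entirely.
            failed.append(relpath)

    print("paths that could not be recovered:")
    for relpath in failed:
        print("  " + relpath)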

Revision history for this message
TTimo (ttimo) wrote :
Revision history for this message
Jamus Jegier (jamus+launchpad) wrote :

I have also been struggling with a CRC error over the past week. I would also like to request a --test-restore option and the ability to skip over CRC errors.

Revision history for this message
TTimo (ttimo) wrote : Re: [Bug 682469] Re: corrupted backup, CRC check failed

I ended up with 48 files that could not be recovered. It took quite some effort to manipulate the duplicity commands to get past the few corrupted volumes and continue working past the initial error at 5 GB. I ended up with 22 GB worth of data restored without error; I believe with a non-corrupted backup I would have ended up at around 27 GB.

