Backup of a large data set fails with duplicity

Bug #1854351 reported by Gangadhar
This bug affects 1 person
Affects: Duplicity
Status: In Progress
Importance: Medium
Assigned to: Kenneth Loafman
Milestone: (none)

Bug Description

We have 103 GB of data in a volume and are trying to back it up, but the backup takes too long and fails before the entire data set is copied.

duplicity version: 0.8.03
data size: 103 GB
endpoint: s3.us-south.objectstorage.softlayer.net
Python version: 3.7.3

cat /etc/os-release
NAME="Alpine Linux"
ID=alpine
VERSION_ID=3.10.1

Note: one of our customers also failed to back up a 130 GB data set.
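
A minimal sketch of the kind of invocation involved, assuming a hypothetical bucket name my-bucket and source path /data, with credentials exported in the environment as duplicity's boto backend expects:

export AWS_ACCESS_KEY_ID=...       # credentials for the object storage endpoint
export AWS_SECRET_ACCESS_KEY=...
# back up the volume to the SoftLayer S3 endpoint (bucket and prefix hypothetical)
duplicity /data s3://s3.us-south.objectstorage.softlayer.net/my-bucket/data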

Revision history for this message
Kenneth Loafman (kenneth-loafman) wrote:

How large is the sigtar file? Do you have any logs?

Most likely you hit this bug: https://bugs.launchpad.net/duplicity/+bug/385495

It's an old one. A couple of workarounds may help (example invocations are sketched after this list):
- split the backup along directories
- use --max-blocksize=8192 or more (must be a multiple of 512)
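
A minimal sketch of both workarounds, reusing the hypothetical bucket and paths from above; the actual directory split would depend on the volume's layout:

# Workaround 1: split the backup along directories, one backup chain per
# subtree, so each signature (sigtar) file stays small.
duplicity /data/projects s3://s3.us-south.objectstorage.softlayer.net/my-bucket/projects
duplicity /data/media s3://s3.us-south.objectstorage.softlayer.net/my-bucket/media

# Workaround 2: raise the librsync block size (a multiple of 512) so the
# signatures computed for very large files stay manageable.
duplicity --max-blocksize=8192 /data s3://s3.us-south.objectstorage.softlayer.net/my-bucket/data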

Changed in duplicity:
assignee: nobody → Kenneth Loafman (kenneth-loafman)
importance: Undecided → Medium
status: New → In Progress