"Not enough memory" error when trying to work with large images with partclone
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
partclone (Ubuntu) | Invalid | Medium | Unassigned |
Xenial | New | Medium | Unassigned |
Bug Description
Partclone can work with partition images that are fragmented into small pieces, reconstructing them and cloning the result back to a partition.
However, the version of partclone in Xenial has a major flaw with very large disks: it demands memory equal to the size of the disk being restored.
This means it cannot reliably create a raw image file from the fragmented pieces of a split image.
This is fixed in later releases: Bionic and up ship 0.3.11, which includes a large number of memory-handling improvements that eliminate these 'not enough memory' errors.
From what I could find online, the fix appears to have landed as early as 0.2.89.
Note that because of this bug, partclone in Xenial is unusable for restoring partitions or creating raw images from segmented images (such as those Clonezilla takes) whenever it has to reconstruct the partition image. For that reason I am setting the bug importance to "Medium".
Bionic and later are not affected.
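To illustrate the workload that triggers the failure, here is a minimal, self-contained sketch of how a split, compressed image is reconstructed by streaming the pieces through `cat`. The file names mimic Clonezilla's layout but are otherwise illustrative; the actual `partclone.ext4` restore command appears only in a comment, since it needs a real partition.

```shell
#!/bin/sh
# Sketch: reconstruct a split (Clonezilla-style) image by streaming the
# pieces, so the whole image never has to be reassembled on disk first.
set -eu

work=$(mktemp -d)

# Stand-in for a partition image (in reality this would be partclone output).
dd if=/dev/urandom of="$work/original.img" bs=1024 count=64 2>/dev/null

# Clonezilla stores images gzip-compressed and split into fixed-size pieces:
gzip -c "$work/original.img" | split -b 16k - "$work/sda1.ext4-ptcl-img.gz."

# Restore by concatenating the pieces in order and decompressing the stream.
# With a real image you would pipe into partclone instead of a file, e.g.:
#   cat sda1.ext4-ptcl-img.gz.* | gunzip -c | partclone.ext4 -r -C -s - -o /dev/sdX1
cat "$work"/sda1.ext4-ptcl-img.gz.* | gunzip -c > "$work/restored.img"

cmp "$work/original.img" "$work/restored.img" && echo "reconstruction OK"
```

With the Xenial partclone, the final restore step is where the allocation proportional to the disk size fails on large disks, even though the streaming pipeline itself needs almost no memory.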
Changed in partclone (Ubuntu Xenial):
  importance: Undecided → Medium
Changed in partclone (Ubuntu):
  status: New → Invalid
Note that I was able to successfully backport the Bionic 0.3.11 package to Xenial in a PPA with only a very minor change to the build dependencies; the resulting package works without issues and is not affected by this bug.
The PPA is available at https://launchpad.net/~teward/+archive/ubuntu/partclone/+packages. Note that it does not conform to Ubuntu changelog revision standards: this was a quick-and-dirty backport rather than one intended to fit neatly within Ubuntu repository policy.
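For anyone who needs the backport, installing from the PPA should look roughly like the following. This is a sketch, not tested instructions: the short PPA name `teward/partclone` is inferred from the URL above, and the commands assume a standard Xenial system with `software-properties-common` available (they require root and network access).

```shell
# Add the backport PPA (name inferred from the Launchpad URL above)
sudo add-apt-repository ppa:teward/partclone
sudo apt-get update

# Upgrade to the backported 0.3.11 partclone package
sudo apt-get install partclone
```

After installing, `partclone.ext4 --help` should report the 0.3.x version, which no longer requires memory proportional to the disk size.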