Crash "No Such File Or Directory"
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
sbackup (Ubuntu) | Invalid | Undecided | Oumar Aziz OUATTARA |
Bug Description
Binary package hint: sbackup
I backed up my home directory with Ubuntu 7.10 to a NAS. I just installed (03/29/2008) Ubuntu 8.04 Beta. Sbackup crashes EVERY SINGLE TIME I try to restore a directory from my NAS.
I am attempting to restore the directory /home/nhilton/
Here is the traceback:

    PythonArgs: ['/usr/
    Traceback:
    Traceback (most recent call last):
      File "/usr/sbin/
        self.
      File "/usr/sbin/
        r.restore( tdir, src, dst )
      File "/usr/share/
        shutil.move( os.path.
      File "/usr/lib/
        copy2(src,dst)
      File "/usr/lib/
        copyfile(src, dst)
      File "/usr/lib/
        fsrc = open(src, 'rb')
    IOError: [Errno 2] No such file or directory: '/home/
After the crash, the temp dir "/home/
This is critical; I cannot restore any files!
This is NOT a bug; it turns out the created archive was corrupt.
Manually trying to extract the archive gave me an "unexpected end of file" on the files.tgz that sbackup made. Looking closer at its size, files.tgz was exactly 2147483647 bytes, 1 byte short of 2 GB. I suspect the crash occurs because the archive cannot be extracted, so there are no files to copy to the proper location.
So perhaps the real bug is that sbackup is unable to create tgz files larger than 2 GB?
Maybe I didn't notice that my backup failed on Ubuntu 7.10 when I created it. sbackup did successfully create an incremental backup on top of the "full" backup. I bet sbackup doesn't even check files.tgz when doing an incremental, but just checks which file versions have changed by looking at flist or something.
So, the real bug is that sbackup is unable to create tgz files larger than 2 GB!
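The reported size lines up exactly with a signed 32-bit integer limit, which supports this theory: somewhere in sbackup or a library it uses, the archive size or offset is presumably held in a 32-bit value. The arithmetic below is just a sanity check of that number, not anything taken from sbackup's code:

```python
# Largest value a signed 32-bit integer can hold
max_int32 = 2**31 - 1

# This matches the size at which files.tgz stopped growing
print(max_int32)      # 2147483647 bytes

# One more byte would be exactly 2 GiB
print(max_int32 + 1 == 2 * 1024**3)  # True
```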