Getting the "metadata not found in archive, no files restored" error
I have an SSD primary drive with a soft symlink pointing to a second drive for all home folders. The ~/.cache/deja-dup/metadata folder therefore lives on that second drive.
Questions:
For duplicity, does archiving a folder under both its symlinked and its real location cause a clash?
Does duplicity back it up twice?
Is duplicity still executing when deja-dup goes to delete the metadata folder, given that this processing is asynchronous in deja-dup?
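To make the first two questions concrete: a symlinked home and its real location are one and the same directory on disk, so a file selection that names both spellings names the same data twice. A minimal Python sketch of the layout (temporary stand-in paths, not the real drives):

```python
import os
import tempfile

# Build a toy version of the layout: a real home under "storage"
# plus a symlink standing in for /home (paths are hypothetical).
root = tempfile.mkdtemp()
real_home = os.path.join(root, "storage", "home", "kirk")
os.makedirs(real_home)
os.symlink(os.path.join(root, "storage", "home"), os.path.join(root, "home"))

linked_home = os.path.join(root, "home", "kirk")

# Both spellings resolve to the same physical directory...
print(os.path.realpath(linked_home) == os.path.realpath(real_home))  # True

# ...so a naive include list containing both names the same data twice.
includes = {os.path.realpath(p) for p in (linked_home, real_home)}
print(len(includes))  # 1
```

Whether duplicity deduplicates the two spellings or walks the tree twice is exactly what the dumps below should reveal.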
Test:
Ran deja-dup from the command line to reproduce the error. Here is how to get the command arguments and errors out of deja-dup:
export DEJA_DUP_DEBUG=1
deja-dup --backup > dump
Test 1: the folder to be archived (/media/storage/home/kirk/) is the actual location my $HOME points to. Here I see only one include for the home folder.
DUPLICITY: . Args: /usr/bin/duplicity
--exclude=/media/storage/backup
--exclude=/media/storage/home/kirk/Downloads
--exclude=/media/storage/home/kirk/.local/share/Trash
--exclude=/media/storage/home/kirk/.cache/deja-dup/tmp
--exclude=/media/storage/home/kirk/.xsession-errors
--exclude=/media/storage/home/kirk/.thumbnails
--exclude=/media/storage/home/kirk/.adobe/Flash_Player/AssetCache
--exclude=/media/storage/home/kirk/.cache/deja-dup
--exclude=/media/storage/home/kirk/.cache
--include=/media/storage/home/kirk
--include=/etc
--exclude=/home/kirk/Downloads
--exclude=/home/kirk/.local/share/Trash
--exclude=/sys
--exclude=/run
--exclude=/proc
--exclude=/home/kirk/.cache/deja-dup/tmp
--exclude=/var/tmp
--exclude=/tmp
--exclude=/home/kirk/.xsession-errors
--exclude=/home/kirk/.thumbnails
--exclude=/home/kirk/.steam/root
--exclude=/home/kirk/.recently-used.xbel
--exclude=/home/kirk/.recent-applications.xbel
--exclude=/home/kirk/.Private
--exclude=/home/kirk/.gvfs
--exclude=/home/kirk/.adobe/Flash_Player/AssetCache
--exclude=/home/kirk/.cache/deja-dup
--exclude=/home/kirk/.cache
--exclude=** --gio --dry-run --volsize=50 / file:///media/storage/backup --no-encryption --verbosity=9 --gpg-options=--no-use-agent --archive-dir=/home/kirk/.cache/deja-dup --tempdir=/tmp --log-fd=15
Test 2: the folder to be archived is just $HOME, which points to /media/storage/home/kirk/. Here I can see two includes which point to the same drive location.
DUPLICITY: . Args: /usr/bin/duplicity
--exclude=/media/storage/backup
--exclude=/home/kirk/Downloads
--exclude=/home/kirk/.local/share/Trash
--exclude=/home/kirk/.cache/deja-dup/tmp
--exclude=/home/kirk/.xsession-errors
--exclude=/home/kirk/.thumbnails
--exclude=/home/kirk/.steam/root
--exclude=/home/kirk/.recently-used.xbel
--exclude=/home/kirk/.recent-applications.xbel
--exclude=/home/kirk/.Private
--exclude=/home/kirk/.gvfs
--exclude=/home/kirk/.adobe/Flash_Player/AssetCache
--exclude=/home/kirk/.cache/deja-dup
--exclude=/home/kirk/.cache
--include=/home/kirk
--exclude=/media/storage/home/kirk/Downloads
--exclude=/media/storage/home/kirk/.local/share/Trash
--exclude=/media/storage/home/kirk/.cache/deja-dup/tmp
--exclude=/media/storage/home/kirk/.xsession-errors
--exclude=/media/storage/home/kirk/.thumbnails
--exclude=/media/storage/home/kirk/.adobe/Flash_Player/AssetCache
--exclude=/media/storage/home/kirk/.cache/deja-dup
--exclude=/media/storage/home/kirk/.cache
--include=/media/storage/home/kirk
--include=/etc
--exclude=/sys
--exclude=/run
--exclude=/proc
--exclude=/var/tmp
--exclude=/tmp
--exclude=** --gio --dry-run --volsize=50 / file:///media/storage/backup --no-encryption --verbosity=9 --gpg-options=--no-use-agent --archive-dir=/home/kirk/.cache/deja-dup --tempdir=/home/kirk/.cache/deja-dup/tmp --log-fd=15
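The two dumps can be compared mechanically by extracting their --include=/--exclude= flags and diffing the sets. A sketch, with short inline samples standing in for the real dump files (a real comparison would read the two captured dumps):

```python
def flags(dump_text: str) -> set:
    """Collect --include=/--exclude= values from a captured deja-dup dump."""
    return {tok for tok in dump_text.split()
            if tok.startswith(("--include=", "--exclude="))}

# Abridged stand-ins for the two captured runs (hypothetical shortening).
dump1 = "--exclude=/sys --exclude=/run --include=/etc --include=/media/storage/home/kirk"
dump2 = "--include=/etc --exclude=/run --exclude=/sys --include=/media/storage/home/kirk --include=/home/kirk"

print(sorted(flags(dump2) - flags(dump1)))  # ['--include=/home/kirk']: the extra include
print(flags(dump1) <= flags(dump2))         # True: otherwise same flags, different order
```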
Observation:
Either way, both tests fail with the same error.
There are differences between pointing directly at the symlink and at the real location:
The two tests differ in the number of includes.
The two tests seem to be composed of generally the same set of excludes, but in a different order.
.cache/deja-dup is excluded in both cases, so why would it ever get purged? Yet the metadata file does go missing.
There is code in several places in deja-dup that deletes this file, which deja-dup itself had created.
deja-dup deletes this metadata file right before it goes to verify against it.
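On the exclusion point: duplicity evaluates --include/--exclude options in order, first match wins, ending in the catch-all --exclude=**. The toy matcher below (plain prefix matching only, not duplicity's real glob engine, and only an abridged subset of the flags from the dumps) shows the metadata folder losing to an exclude under either spelling, so duplicity itself should never purge it:

```python
# Simplified first-match-wins selection, modeled on duplicity's
# --include/--exclude ordering. Prefix matching only; the real matcher
# also handles globs. The rules are an abridged subset of the dumps.
RULES = [
    ("exclude", "/media/storage/backup"),
    ("exclude", "/home/kirk/.cache/deja-dup/tmp"),
    ("exclude", "/home/kirk/.cache/deja-dup"),
    ("exclude", "/home/kirk/.cache"),
    ("include", "/home/kirk"),
    ("exclude", "/media/storage/home/kirk/.cache/deja-dup"),
    ("exclude", "/media/storage/home/kirk/.cache"),
    ("include", "/media/storage/home/kirk"),
    ("include", "/etc"),
]

def selected(path: str) -> bool:
    """True if `path` would be backed up: first matching rule wins,
    anything unmatched falls through to the trailing --exclude=**."""
    for action, prefix in RULES:
        if path == prefix or path.startswith(prefix + "/"):
            return action == "include"
    return False  # --exclude=**

# The metadata dir is excluded under both spellings...
print(selected("/home/kirk/.cache/deja-dup/metadata"))                # False
print(selected("/media/storage/home/kirk/.cache/deja-dup/metadata"))  # False
# ...while ordinary home files are included under both.
print(selected("/home/kirk/Documents/notes.txt"))                     # True
```

Which points the suspicion away from duplicity's file selection and at deja-dup's own cleanup code: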
OperationBackup.vala
....
internal async override void operation_finished(ToolJob job, bool success, bool cancelled, string? detail)
{
  /* If successfully completed, update time of last backup and run base operation_finished */
  if (success)
    DejaDup.update_last_run_timestamp(DejaDup.TimestampType.BACKUP);

  if (metadir != null)
    new RecursiveDelete(metadir).start();

  if (success && !cancelled)
    yield chain_op(new OperationVerify(), _("Verifying backup…"), detail);
  else
    yield base.operation_finished(job, success, cancelled, detail);
}
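Note that new RecursiveDelete(metadir).start() is fired without being yielded, so the delete runs concurrently with whatever comes next, including the chained verify operation. A toy asyncio model of that suspected race (invented timings and Python stand-ins, not deja-dup's actual GLib main loop):

```python
import asyncio
import os
import tempfile

async def recursive_delete(metadir: str) -> None:
    # Stand-in for deja-dup's RecursiveDelete: removes the metadata folder.
    os.remove(os.path.join(metadir, "README"))
    os.rmdir(metadir)

async def verify(metadir: str) -> bool:
    # Stand-in for OperationVerify: it only passes if README survives.
    await asyncio.sleep(0.01)  # the chained op starts a moment later
    try:
        with open(os.path.join(metadir, "README")) as f:
            return f.readline().rstrip("\n") == "This folder can be safely deleted."
    except OSError:
        return False

async def operation_finished(metadir: str) -> bool:
    # Mirrors the structure above: the delete is started, not awaited,
    # before the verify operation is chained.
    asyncio.create_task(recursive_delete(metadir))
    return await verify(metadir)

metadir = tempfile.mkdtemp()
with open(os.path.join(metadir, "README"), "w") as f:
    f.write("This folder can be safely deleted.\n")

print(asyncio.run(operation_finished(metadir)))  # False: metadata gone before verify reads it
```

In this model the verify step always loses, because the un-awaited delete completes while verify is still waiting its turn.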
OperationVerify.vala
...
internal async override void operation_finished(ToolJob job, bool success, bool cancelled, string? detail)
{
  // Verify results
  if (success) {
    var verified = true;
    string contents;
    try {
      FileUtils.get_contents(Path.build_filename(metadir.get_path(), "README"), out contents);
    }
    catch (Error e) {
      verified = false;
    }
    if (verified) {
      var lines = contents.split("\n");
      verified = (lines[0] == "This folder can be safely deleted.");
    }
    if (!verified) {
      raise_error(_("Your backup appears to be corrupted. You should delete the backup and try again."), null);
      success = false;
    }
    if (nag)
      update_nag_time();
  }

  new RecursiveDelete(metadir).start();
  yield base.operation_finished(job, success, cancelled, detail);
}
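So the verify step simply reads the README from the metadata folder and compares its first line against a fixed sentinel string; if the folder was deleted before this runs, the check fails. The same check rendered in Python (the sentinel text comes from the Vala source above; the helper name and everything else is a sketch):

```python
import os
import tempfile

SENTINEL = "This folder can be safely deleted."

def verify_metadata(metadir: str) -> bool:
    """Mirror of OperationVerify's check: the backup only verifies if the
    README's first line matches the sentinel exactly."""
    try:
        with open(os.path.join(metadir, "README")) as f:
            contents = f.read()
    except OSError:
        return False  # README missing: the failure mode seen here
    lines = contents.split("\n")
    return lines[0] == SENTINEL

metadir = tempfile.mkdtemp()
print(verify_metadata(metadir))  # False: no README present

with open(os.path.join(metadir, "README"), "w") as f:
    f.write(SENTINEL + "\n")
print(verify_metadata(metadir))  # True once the README is intact
```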