
duplicity-team team mailing list archive

Re: [Question #108413]: Suggestion, hash backup files to insure they are not corrupted


Question #108413 on Duplicity changed:
https://answers.launchpad.net/duplicity/+question/108413

    Status: Answered => Open

Alex Robinson is still having a problem:
From the man page I got the impression that verify compares the
current state of the "source" (the data to be backed up) against the
current state of the backup. I would expect the backup's sig files to
contain enough data to handle this job. And if the sig files are
duplicated on the source machine (so that they do not need to be sent
back to the source machine for verification), then the verify operation
would only need to hash them on both ends to ensure that they are the
same, before running a check of the local sig files against the source
data. Hence, to verify, there is no reason even to read the actual
backup data files on the remote backup machine. Or does verify both
hash the sig files on the remote machine AND validate on the remote
machine that the sigs match the backup data files as they exist at the
moment? If so, then gpg would not need to be involved (assuming that
the sig files include hashes of the encrypted data).

-- 
You received this question notification because you are a member of
duplicity-team, which is an answer contact for Duplicity.