
duplicity-team team mailing list archive

Re: [Question #180135]: OSError: [Errno 24] Too many open files

 

Question #180135 on Duplicity changed:
https://answers.launchpad.net/duplicity/+question/180135

    Status: Open => Answered

edso proposed the following answer:
On 27.11.2011 10:20, Rodrigo Alvarez wrote:
> New question #180135 on Duplicity:
> https://answers.launchpad.net/duplicity/+question/180135
> 
> A perfectly good backup set suddenly goes wrong and verification reports "OSError: [Errno 24] Too many open files" during the attempt.  
> 
> running ulimit -n returns: 8192
> 
> Just last month this backup set passed my monthly verification test with flying colors, but now it is somehow corrupted and throwing this error.  What concerns me most is that this is not the first time this has happened, nor the only set affected.  Something in the backup process is corrupting my sets.
> 
> To rule out hard disk corruption I compare (on a monthly basis) my backup volumes with an externally mirrored set of backup volumes (diff -rqs source target), and this does not flag any changes on disk.
> 
> In the past I've fixed this by simply biting the bullet and deleting some dates from my backup set until I could verify the set again, but this involves a lot of guesswork to find the date at which the set became corrupted.  Can you please include extra feedback in duplicity so that when "OSError: [Errno 24] Too many open files" is triggered I can tell which volume and what date were being accessed?  This would be a first step toward debugging why backup sets are being corrupted.
> 

which duplicity version do you use?

what is your operating system?

this comes up from time to time. see e.g. these earlier threads in the mailing list archive
http://lists.nongnu.org/archive/cgi-bin/namazu.cgi?query=too+many+open+files&idxname=duplicity-talk

good news is: your backup is _not_ corrupt. in short, you probably have a very long backup chain and your python/os combination either
- does not properly close the files duplicity opens,
or
- does not close them fast enough
(a quick way to watch the open descriptor count is sketched below).
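
if you want to see how close a process actually gets to that limit, here is a
rough sketch (assuming Linux, where /proc/<pid>/fd exists; this is not part of
duplicity itself) comparing the descriptor count of the current process
against the soft limit:

  import os
  import resource

  # soft limit is what ulimit -n reports; hard limit is the ceiling it may be raised to
  soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)

  # /proc/self/fd holds one entry per open descriptor of this process;
  # for a running duplicity instance read /proc/<pid>/fd instead
  open_fds = len(os.listdir('/proc/self/fd'))

  print("open fds: %d (soft limit %d, hard limit %d)" % (open_fds, soft, hard))

watching that number while verify walks a long chain shows whether descriptors
pile up faster than they are released.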

try raising the open files limit on your platform and/or shortening your
backup chains. i, for example, use monthly fulls and daily backups.
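
if you launch duplicity from a small python wrapper of your own (just an
assumption, nothing duplicity ships), the soft limit can be bumped up to the
hard limit from within python; at the shell level ulimit -n <number> does the
same job:

  import resource

  # raise the soft open-files limit of this process (and of any child it
  # spawns afterwards, e.g. a duplicity subprocess) up to the hard limit
  soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
  if soft < hard:
      resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))

  print("RLIMIT_NOFILE is now: %s" % (resource.getrlimit(resource.RLIMIT_NOFILE),))

note that without root the soft limit can only be raised as far as the hard limit.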

good luck ede/duply.net

-- 
You received this question notification because you are a member of
duplicity-team, which is an answer contact for Duplicity.