SSLError while listing large WebDAV directory

Bug #746292 reported by Thomas Tanner on 2011-03-31
This bug affects 1 person
Affects: Duplicity | Importance: Undecided | Assigned to: Unassigned

Bug Description

I'm using the WebDAV backend to back up to a storage space provided by a large German hosting provider.
Starting with an empty directory on the server, everything works fine until, after about 54 incremental backups,
I always get the following error (Ubuntu 10.04):

Start duply v1.5.2.3, time is 2011-03-31 10:13:22.
Using profile '/etc/duply/hourly'.
Using installed duplicity version 0.6.12, gpg 1.4.10 (Home: ~/.gnupg)
Test - Encryption with key 1454AE98 (OK)
Test - Decryption with key 1454AE98 (OK)
Test - Compare Original w/ Decryption (OK)
Cleanup - Delete '/tmp/duply.1007.1301559202_*'(OK)

--- Start running command INCR at 10:13:22.292 ---
Using archive dir: /root/.cache/duplicity/duply_hourly
Using backup name: duply_hourly
Import of duplicity.backends.ftpbackend Succeeded
Import of duplicity.backends.giobackend Failed: No module named gio
Import of duplicity.backends.rsyncbackend Succeeded
Import of duplicity.backends.localbackend Succeeded
Import of duplicity.backends.webdavbackend Succeeded
Import of duplicity.backends.imapbackend Succeeded
Import of duplicity.backends.tahoebackend Succeeded
Import of duplicity.backends.botobackend Succeeded
Import of duplicity.backends.sshbackend Succeeded
Import of duplicity.backends.hsibackend Succeeded
Import of duplicity.backends.cloudfilesbackend Succeeded
Using WebDAV host webdav.hidrive.strato.com
Using WebDAV directory /users/me/hourly/
Using WebDAV protocol http
Reading globbing filelist /etc/duply/hourly/exclude
Main action: inc
================================================================================
duplicity 0.6.12 (March 08, 2011)
Args: /usr/bin/duplicity incr --name duply_hourly --encrypt-key 1454AE98 --sign-key 1454AE98 --verbosity 9 --num-retries 5 --exclude-globbing-filelist /etc/duply/hourly/exclude / webdavs://<email address hidden>/users/me/hourly
Linux myhost 2.6.33.7-vs2.3.0.36.30.4 #1 SMP Tue Nov 16 08:24:31 UTC 2010 x86_64
/usr/bin/python 2.6.5 (r265:79063, Apr 16 2010, 13:57:41)
[GCC 4.4.3]
================================================================================
Using temporary directory /tmp/duplicity-JDYTLm-tempdir
Registering (mkstemp) temporary file /tmp/duplicity-JDYTLm-tempdir/mkstemp-_cV4ZB-1
Temp has 37880573952 available, backup will use approx 34078720.
Listing directory /users/me/hourly/ on WebDAV server
WebDAV PROPFIND attempt #1 failed: 200
Listing directory /users/me/hourly/ on WebDAV server
Removing still remembered temporary file /tmp/duplicity-JDYTLm-tempdir/mkstemp-_cV4ZB-1
Traceback (most recent call last):
  File "/usr/bin/duplicity", line 1261, in <module>
    with_tempdir(main)
  File "/usr/bin/duplicity", line 1254, in with_tempdir
    fn()
  File "/usr/bin/duplicity", line 1155, in main
    sync_archive()
  File "/usr/bin/duplicity", line 931, in sync_archive
    remlist = globals.backend.list()
  File "/usr/lib/python2.6/dist-packages/duplicity/backends/webdavbackend.py", line 165, in list
    response = self.request("PROPFIND", self.directory, self.listbody)
  File "/usr/lib/python2.6/dist-packages/duplicity/backends/webdavbackend.py", line 107, in request
    response = self.conn.getresponse()
  File "/usr/lib/python2.6/httplib.py", line 986, in getresponse
    response.begin()
  File "/usr/lib/python2.6/httplib.py", line 391, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.6/httplib.py", line 349, in _read_status
    line = self.fp.readline()
  File "/usr/lib/python2.6/socket.py", line 397, in readline
    data = recv(1)
  File "/usr/lib/python2.6/ssl.py", line 96, in <lambda>
    self.recv = lambda buflen=1024, flags=0: SSLSocket.recv(self, buflen, flags)
  File "/usr/lib/python2.6/ssl.py", line 222, in recv
    raise x
SSLError: The read operation timed out

10:13:52.695 Task 'INCR' failed with exit code '30'.

----------------------------------------------------------
If I empty the directory, perform a full backup, and then an incremental backup,
the output is:

Listing directory /users/me/hourly/ on WebDAV server
WebDAV PROPFIND attempt #1 failed: 200
Listing directory /users/me/hourly/ on WebDAV server
<?xml version="1.0" encoding="utf-8"?>
<D:multistatus xmlns:D="DAV:" xmlns:ns0="DAV:">
<D:response xmlns:lp1="DAV:" xmlns:lp2="http://apache.org/dav/props/">
<D:href>/users/me/hourly/</D:href>
<D:propstat>
<D:prop>
<lp1:resourcetype><D:collection/></lp1:resourcetype>
<lp1:creationdate>2011-03-31T08:18:49Z</lp1:creationdate>
<lp1:getlastmodified>Thu, 31 Mar 2011 08:18:49 GMT</lp1:getlastmodified>
<lp1:getetag>"5-5-49fc2f16efdb0"</lp1:getetag>
<D:supportedlock>
<D:lockentry>
<D:lockscope><D:exclusive/></D:lockscope>
<D:locktype><D:write/></D:locktype>
</D:lockentry>
<D:lockentry>
<D:lockscope><D:shared/></D:lockscope>
<D:locktype><D:write/></D:locktype>
</D:lockentry>
</D:supportedlock>
<D:lockdiscovery/>
</D:prop>
<D:status>HTTP/1.1 200 OK</D:status>
</D:propstat>
</D:response>
<D:response xmlns:lp1="DAV:" xmlns:lp2="http://apache.org/dav/props/">
<D:href>/users/me/hourly/duplicity-full.20110331T081843Z.vol1.difftar.gpg</D:href>
<D:propstat>
<D:prop>
<lp1:resourcetype/>
<lp1:creationdate>2011-03-31T08:18:49Z</lp1:creationdate>
<lp1:getcontentlength>4429620</lp1:getcontentlength>
<lp1:getlastmodified>Thu, 31 Mar 2011 08:18:49 GMT</lp1:getlastmodified>
<lp1:getetag>"179-439734-49fc2f1693948"</lp1:getetag>
<lp2:executable>F</lp2:executable>
<D:supportedlock>
<D:lockentry>
<D:lockscope><D:exclusive/></D:lockscope>
<D:locktype><D:write/></D:locktype>
</D:lockentry>
<D:lockentry>
<D:lockscope><D:shared/></D:lockscope>
<D:locktype><D:write/></D:locktype>
</D:lockentry>
</D:supportedlock>
<D:lockdiscovery/>
</D:prop>
<D:status>HTTP/1.1 200 OK</D:status>
</D:propstat>
</D:response>

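For reference, a successful PROPFIND returns a multistatus body like the one above, and the backend's job is essentially to pull out the D:href entries and strip the directory prefix. A minimal sketch of that parsing step (not duplicity's actual code) using ElementTree, fed a trimmed-down copy of the response above:

```python
import xml.etree.ElementTree as ET

# Trimmed-down version of the multistatus response shown above.
MULTISTATUS = b"""<?xml version="1.0" encoding="utf-8"?>
<D:multistatus xmlns:D="DAV:">
  <D:response>
    <D:href>/users/me/hourly/</D:href>
  </D:response>
  <D:response>
    <D:href>/users/me/hourly/duplicity-full.20110331T081843Z.vol1.difftar.gpg</D:href>
  </D:response>
</D:multistatus>"""

def list_hrefs(body):
    """Return the path of every D:href element in a PROPFIND multistatus body."""
    root = ET.fromstring(body)
    # Elements in the DAV: namespace appear as '{DAV:}href' to ElementTree.
    return [el.text for el in root.iter('{DAV:}href')]

def list_files(body, directory):
    """Filenames under `directory`, skipping the collection entry itself."""
    return [h[len(directory):] for h in list_hrefs(body) if h != directory]

print(list_files(MULTISTATUS, '/users/me/hourly/'))
# ['duplicity-full.20110331T081843Z.vol1.difftar.gpg']
```

With ~54 incrementals the real response contains hundreds of these D:response blocks, which is presumably why the server takes long enough to assemble it that the read times out.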
----------------------------------------------------------
Is this a bug on the server side? Is there a workaround?
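One observation: the run starts at 10:13:22 and fails at 10:13:52, exactly 30 seconds later, which suggests the read is hitting duplicity's network timeout (there is a --timeout option that may raise it) rather than a server-side error. As a generic workaround sketch, a wrapper that retries the listing with a progressively longer socket timeout; the names and timeout values here are illustrative, not duplicity options, and note that on Python 2.6 the timeout surfaces as ssl.SSLError rather than socket.timeout:

```python
import socket

def with_retries(fn, timeouts=(30, 60, 120, 240, 480)):
    """Call fn(), retrying with a progressively longer default socket
    timeout each time the read times out.  On modern Python, SSL read
    timeouts raise socket.timeout; on 2.6 you would also catch ssl.SSLError."""
    last = None
    for timeout in timeouts:
        socket.setdefaulttimeout(timeout)
        try:
            return fn()
        except socket.timeout as exc:
            last = exc  # remember the failure and retry with a longer timeout
    raise last
```

Usage would be something like `files = with_retries(backend.list)`, where `backend` is the hypothetical WebDAV backend object; the point is only that a slow-but-working server can be accommodated by lengthening the read timeout before giving up.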
