BadStatusLine when trying to use webdavs

Bug #511705 reported by Ralf Herold
This bug affects 9 people
Affects     Status         Importance   Assigned to   Milestone
Duplicity   Fix Released   Undecided    Unassigned
Debian      New            Undecided    Unassigned

Bug Description

This is a new error message; previously, backing up to webdavs worked fine for me.
Many thanks for your work and the excellent program -

Duplicity version: 0.6.06
Python version: 2.6.4
OS Distro and version: Ubuntu 9.10
Type of target filesystem: webdavs://sd2dav.1und1.de

Traceback (most recent call last):
  File "/usr/bin/duplicity", line 1236, in <module>
    with_tempdir(main)
  File "/usr/bin/duplicity", line 1229, in with_tempdir
    fn()
  File "/usr/bin/duplicity", line 1211, in main
    incremental_backup(sig_chain)
  File "/usr/bin/duplicity", line 487, in incremental_backup
    globals.backend)
  File "/usr/bin/duplicity", line 315, in write_multivol
    (tdp, dest_filename)))
  File "/usr/lib/python2.6/dist-packages/duplicity/asyncscheduler.py", line 148, in schedule_task
    return self.__run_synchronously(fn, params)
  File "/usr/lib/python2.6/dist-packages/duplicity/asyncscheduler.py", line 175, in __run_synchronously
    ret = fn(*params)
  File "/usr/bin/duplicity", line 314, in <lambda>
    async_waiters.append(io_scheduler.schedule_task(lambda tdp, dest_filename: put(tdp, dest_filename),
  File "/usr/bin/duplicity", line 240, in put
    backend.put(tdp, dest_filename)
  File "/usr/lib/python2.6/dist-packages/duplicity/backends/webdavbackend.py", line 256, in put
    response = self.request("PUT", url, source_file.read())
  File "/usr/lib/python2.6/dist-packages/duplicity/backends/webdavbackend.py", line 108, in request
    response = self.conn.getresponse()
  File "/usr/lib/python2.6/httplib.py", line 974, in getresponse
    response.begin()
  File "/usr/lib/python2.6/httplib.py", line 391, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.6/httplib.py", line 355, in _read_status
    raise BadStatusLine(line)
BadStatusLine

Revision history for this message
Chris H. (christoph-herderich) wrote :

Ralf,

I get the same traceback with openSUSE 11.2, Python 2.6 and duplicity 0.6.06 on webdavs://sd2dav.1und1.de

What I observed is that everything works fine (no traceback) as long as the directory on the webdavs://sd2dav.1und1.de server contains only a few files.

Starting at roughly 200 files I get the error "sometimes". With a backup set of 357 files the operation stopped for good in my case.

I think it is a timeout problem.

I played with the --timeout parameter of duplicity but without success.

I think the problem is that in httplib.py the _read_status method of the HTTPResponse class reads a line from a file object mapped to the socket connection to the server, and this readline() returns nothing.
Maybe it would be necessary to wait a little longer for the server's response.
I am not a Python programmer, and my simple approach of setting self.timeout on the HTTPConnection class to 300 and then 3000 didn't help.
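
Just to illustrate what I mean, here is a rough sketch of a possible workaround (not duplicity's actual code; the function name and connection handling are made up for the example): catch the BadStatusLine around getresponse(), reopen the connection, and resend the request once.

import httplib

def dav_request_with_retry(host, method, url, body=None, headers=None):
    # Illustration only: send one request over HTTPS and, if the server has
    # silently dropped the keep-alive connection (which surfaces as an empty
    # status line, i.e. BadStatusLine), reconnect and retry exactly once.
    conn = httplib.HTTPSConnection(host)
    try:
        conn.request(method, url, body, headers or {})
        return conn.getresponse()
    except httplib.BadStatusLine:
        conn.close()                              # drop the stale socket
        conn = httplib.HTTPSConnection(host)      # open a fresh connection
        conn.request(method, url, body, headers or {})
        return conn.getresponse()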

Maybe one of the developers has a good idea ...

Chris

------------------------------
Here is what I get with httplib.py debugging turned on for small numbers of files:

... stuff deleted ....
Local and Remote metadata are synchronized, no sync needed.
send: 'PROPFIND /backup/bilbo/ HTTP/1.1\r\nHost: sd2dav.1und1.de\r\nAccept-Encoding: identity\r\nContent-Length: 96\r\nConnection: keep-alive\r\nDepth: 1\r\nAuthorization: Basic ..something...\r\n\r\n'
send: '<?xml version="1.0" encoding="utf-8" ?>\n<D:propfind xmlns:D="DAV:">\n<D:allprop/>\n</D:propfind>\n\n'
reply: 'HTTP/1.1 207 Multi-Status\r\n'
header: Date: Sun, 28 Feb 2010 20:45:27 GMT
header: Server: Apache/2.0.63 (Unix) mod_ssl/2.0.63 OpenSSL/0.9.7d DAV/2 Catacomb/static
header: Content-Type: text/xml; charset="utf-8"
header: Vary: Accept-Encoding
header: Content-Length: 15186
header: Keep-Alive: timeout=1, max=99
header: Connection: Keep-Alive

----------------------------------
For many files I simply get:

... stuff deleted ....
Local and Remote metadata are synchronized, no sync needed.
send: 'PROPFIND /backup/bilbo/ HTTP/1.1\r\nHost: sd2dav.1und1.de\r\nAccept-Encoding: identity\r\nContent-Length: 96\r\nConnection: keep-alive\r\nDepth: 1\r\nAuthorization: Basic ..something...\r\n\r\n'
send: '<?xml version="1.0" encoding="utf-8" ?>\n<D:propfind xmlns:D="DAV:">\n<D:allprop/>\n</D:propfind>\n\n'
reply: ''

+ the traceback from above.
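
In case anyone wants to reproduce such a trace themselves: the send:/reply:/header: lines above come from httplib's built-in debug output. A minimal sketch of how to produce it (the PROPFIND against /backup/bilbo/ mirrors my setup; the rest is just illustrative, and without an Authorization header a real server would of course answer 401):

import httplib

conn = httplib.HTTPSConnection("sd2dav.1und1.de")
conn.set_debuglevel(1)   # prints the raw send:/reply:/header: lines
conn.request("PROPFIND", "/backup/bilbo/", None, {"Depth": "1"})
response = conn.getresponse()
print response.status, response.reason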

Revision history for this message
Wolfram Riedel (taisto-web) wrote :

Traceback (most recent call last):
  File "/usr/bin/duplicity", line 1239, in <module>
    with_tempdir(main)
  File "/usr/bin/duplicity", line 1232, in with_tempdir
    fn()
  File "/usr/bin/duplicity", line 1137, in main
    globals.archive_dir).set_values()
  File "/usr/lib/python2.6/dist-packages/duplicity/collections.py", line 658, in set_values
    backend_filename_list = self.backend.list()
  File "/usr/lib/python2.6/dist-packages/duplicity/backends/webdavbackend.py", line 165, in list
    response = self.request("PROPFIND", self.directory, self.listbody)
  File "/usr/lib/python2.6/dist-packages/duplicity/backends/webdavbackend.py", line 107, in request
    response = self.conn.getresponse()
  File "/usr/lib/python2.6/httplib.py", line 974, in getresponse
    response.begin()
  File "/usr/lib/python2.6/httplib.py", line 391, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.6/httplib.py", line 355, in _read_status
    raise BadStatusLine(line)
BadStatusLine
---
with duplicity 0.6.08-0ubuntu0karmic1, using web.de smartdisk webdav

Revision history for this message
Michael Trunner (trunneml) wrote :

Same here with Debian stable plus backports. I use duply to control duplicity. With duplicity 0.6.08b from backports I get these error messages (client and Apache side):

Traceback:

Traceback (most recent call last):
  File "/usr/bin/duplicity", line 1251, in <module>
    with_tempdir(main)
  File "/usr/bin/duplicity", line 1244, in with_tempdir
    fn()
  File "/usr/bin/duplicity", line 1217, in main
    full_backup(col_stats)
  File "/usr/bin/duplicity", line 416, in full_backup
    globals.backend)
  File "/usr/bin/duplicity", line 315, in write_multivol
    (tdp, dest_filename)))
  File "/usr/lib/python2.5/site-packages/duplicity/asyncscheduler.py", line 145, in schedule_task
    return self.__run_synchronously(fn, params)
  File "/usr/lib/python2.5/site-packages/duplicity/asyncscheduler.py", line 171, in __run_synchronously
    ret = fn(*params)
  File "/usr/bin/duplicity", line 314, in <lambda>
    async_waiters.append(io_scheduler.schedule_task(lambda tdp, dest_filename: put(tdp, dest_filename),
  File "/usr/bin/duplicity", line 240, in put
    backend.put(tdp, dest_filename)
  File "/usr/lib/python2.5/site-packages/duplicity/backends/webdavbackend.py", line 251, in put
    response = self.request("PUT", url, source_file.read())
  File "/usr/lib/python2.5/site-packages/duplicity/backends/webdavbackend.py", line 107, in request
    response = self.conn.getresponse()
  File "/usr/lib/python2.5/httplib.py", line 928, in getresponse
    response.begin()
  File "/usr/lib/python2.5/httplib.py", line 385, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.5/httplib.py", line 349, in _read_status
    raise BadStatusLine(line)
BadStatusLine

The Apache2 error log contains no error message!
The Apache2 access log shows this:

188.40.48.x - backupuser [06/Sep/2010:11:04:37 +0200] "PUT /backup/duplicity-full.20100906T085526Z.vol17.difftar.gpg HTTP/1.1" 201 421 "-" "-"
188.40.48.x - backupuser [06/Sep/2010:11:04:42 +0200] "PUT /backup/duplicity-full.20100906T085526Z.vol18.difftar.gpg HTTP/1.1" 201 421 "-" "-"
188.40.48.x - backupuser [06/Sep/2010:11:04:56 +0200] "PUT /backup/duplicity-full.20100906T085526Z.vol19.difftar.gpg HTTP/1.1" 201 421 "-" "-"

The backup works without any problem with duplicity 0.4.11-2 from Debian stable. The backport version (duplicity 0.6.08b-1~bpo50+2) doesn't work.

Revision history for this message
Martin Stelzer (the-assassin) wrote :

Synchronizing remote metadata to local cache...
GnuPG passphrase:
Copying duplicity-full-signatures.20110218T204037Z.sigtar to local cache.
Traceback (most recent call last):
  File "/opt/bin/duplicity", line 1245, in <module>
    with_tempdir(main)
  File "/opt/bin/duplicity", line 1238, in with_tempdir
    fn()
  File "/opt/bin/duplicity", line 1139, in main
    sync_archive()
  File "/opt/bin/duplicity", line 953, in sync_archive
    copy_to_local(fn)
  File "/opt/bin/duplicity", line 905, in copy_to_local
    fileobj = globals.backend.get_fileobj_read(rem_name)
  File "/opt/lib/python2.6/site-packages/duplicity/backend.py", line 475, in get
_fileobj_read
    self.get(filename, tdp)
  File "/opt/lib/python2.6/site-packages/duplicity/backends/webdavbackend.py", l
ine 233, in get
    response = self.request("GET", url)
  File "/opt/lib/python2.6/site-packages/duplicity/backends/webdavbackend.py", l
ine 107, in request
    response = self.conn.getresponse()
  File "/opt/lib/python2.6/httplib.py", line 990, in getresponse
    response.begin()
  File "/opt/lib/python2.6/httplib.py", line 391, in begin
    version, status, reason = self._read_status()
  File "/opt/lib/python2.6/httplib.py", line 355, in _read_status
    raise BadStatusLine(line)
BadStatusLine
------
with duplicity 0.6.11 on DS211 using sd2dav.1und1.de

Revision history for this message
mogliii (mogliii) wrote :

I get the same error with Strato HiDrive and webdavs (BadStatusLine).

Switched to ftp (they offer many different means of connecting to the storage).

Revision history for this message
mogliii (mogliii) wrote :

I forgot to add: duplicity 0.6.18

Revision history for this message
mogliii (mogliii) wrote :

This bug (https://bugs.launchpad.net/duplicity/+bug/811832) seems to be a duplicate.

summary: - BadStatusLine when trying to use webdavs (0.6.06)
+ BadStatusLine when trying to use webdavs
Revision history for this message
Dmitry "Harbinger" (doctorlans) wrote :

I get the same bug when trying to connect to the Yandex Disk cloud via WebDAV.

$ duplicity ~/Upload/Backups/Test1/ webdavs://login@<email address hidden>/backups/test1
Traceback (most recent call last):
  File "/usr/bin/duplicity", line 1403, in <module>
    with_tempdir(main)
  File "/usr/bin/duplicity", line 1396, in with_tempdir
    fn()
  File "/usr/bin/duplicity", line 1272, in main
    sync_archive(decrypt)
  File "/usr/bin/duplicity", line 1024, in sync_archive
    remlist = globals.backend.list()
  File "/usr/lib/python2.7/dist-packages/duplicity/backends/webdavbackend.py", line 166, in list
    response = self.request("PROPFIND", self.directory, self.listbody)
  File "/usr/lib/python2.7/dist-packages/duplicity/backends/webdavbackend.py", line 108, in request
    response = self.conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1030, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 407, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 371, in _read_status
    raise BadStatusLine(line)
BadStatusLine: ''

Revision history for this message
Dmitry "Harbinger" (doctorlans) wrote :

Forgot: duplicity 0.6.18

Revision history for this message
Rolf Glei (rogle-le-deactivatedaccount) wrote :

Hello,

I've changed providers; now I have no problems using duplicity 0.6.19.

Maybe you can play a little with some switches in .davfs/davfs2.conf.

This works for me:

use_locks 0
use_expect100 1
cache_size 20 # MiByte
table_size 4096
dir_refresh 30 # seconds
delay_upload 1

I wish you maximum success!

Revision history for this message
Dmitry "Harbinger" (doctorlans) wrote :

I don't have davfs installed. Is it necessary?

Revision history for this message
Rolf Glei (rogle-le-deactivatedaccount) wrote :

Good question; it seems not.

But if you have davfs or davfs2, you can mount your cloud storage as a local filesystem. Then you can check whether the line (connection) or compatibility is the problem.

Maybe you can check the '--num-retries number' parameter, where number is how often the operation should be retried (see the example below).

But it is a hard way to go about it... Sorry, I can't give more help.
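
For example, something along these lines (the retry count, local path and webdavs URL are just placeholders, not a tested recommendation):

duplicity --num-retries 10 /home/user/data webdavs://user@example.com/backup/data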

Changed in duplicity:
status: New → Fix Released