sqlalchemy failures when using multi-backend

Bug #1430859 reported by Eric Harney
This bug affects 1 person
Affects: Cinder
Status: New
Importance: Undecided
Assigned to: Unassigned

Bug Description

I'm encountering failures on master when configuring multi-backend. I hit this repeatedly at c-vol startup when configuring 4 rbd backends on Fedora 20.

The errors indicate various failures in SQLAlchemy's communication with the SQL server. Two common examples:

2015-03-11 11:25:05.361 TRACE cinder.openstack.common.threadgroup DBError: This result object does not return rows. It has been closed automatically.

2015-03-11 11:30:52.038 TRACE cinder DBConnectionError: (OperationalError) (2013, 'Lost connection to MySQL server during query')

From what I can tell, this was introduced by:
6879bd0 NFS Security Enhancements: allows secure NFS environment setup

The reason it breaks things: previously, we did not use the database before passing self.db to import_object(volume_driver, ..., db=self.db, ...), so the driver presumably got a freshly initialized connection. Since that change, we do use the database before that call, so the volume manager and the driver both end up using the same SQLAlchemy object/connection, which is not safe.

We need to ensure each backend gets a clean object/connection to work with.
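As a rough illustration (plain SQLAlchemy, not Cinder code; the backend count and DB URL are made up), the unsafe pattern is several backends all issuing queries through one already-checked-out connection, versus each backend opening its own:

    from sqlalchemy import create_engine, text

    engine = create_engine("mysql://cinder:secret@localhost/cinder")  # placeholder URL

    # Unsafe: one checked-out connection handed to all four backends. If two
    # of them run queries on it concurrently, the MySQL protocol state gets
    # corrupted, and errors like the two traces above are the result.
    shared_conn = engine.connect()
    unsafe_backends = [shared_conn] * 4

    # Safe: each backend checks out its own connection from the pool, so no
    # two backends ever interleave traffic on the same DBAPI connection.
    safe_backends = [engine.connect() for _ in range(4)]
    for conn in safe_backends:
        conn.execute(text("SELECT 1"))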

Tags: db
Revision history for this message
Eric Harney (eharney) wrote :

To verify that the noted change causes this, just set "vol_db_empty = False" in VolumeManager's __init__ and see that the failure doesn't occur.
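For context, the shape of that experiment is roughly the following (a toy, not the real cinder/volume/manager.py; only vol_db_empty comes from the report, every other name and the "volumes" table are stand-ins):

    from sqlalchemy import create_engine, text

    class ToyVolumeManager(object):
        """Mimics only the ordering: does the manager touch the DB before the driver?"""

        def __init__(self, db_url, skip_early_db_check=False):
            self.engine = create_engine(db_url)
            if skip_early_db_check:
                # The hard-coded value suggested above: skip the early DB query.
                vol_db_empty = False
            else:
                # Stand-in for the early DB query introduced by 6879bd0.
                with self.engine.connect() as conn:
                    vol_db_empty = conn.execute(
                        text("SELECT COUNT(*) FROM volumes")).scalar() == 0
            # The driver is created afterwards and handed the same DB handle,
            # analogous to import_object(volume_driver, ..., db=self.db, ...).
            self.vol_db_empty = vol_db_empty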

Revision history for this message
Mike Bayer (zzzeek) wrote :

This is a well-known error, described fully at https://bugs.launchpad.net/oslo.db/+bug/1417018. The fixes are known, but to my knowledge there was still some debate over which steps to take.
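For readers landing here first: the generic SQLAlchemy-level remedy discussed over there (this is the standard pattern, not necessarily the exact oslo.db change) is to make sure a forked child or freshly started worker never reuses connections created by its parent, typically by disposing the engine's pool right after the fork:

    import os
    from sqlalchemy import create_engine, text

    engine = create_engine("mysql://cinder:secret@localhost/cinder")  # placeholder URL

    pid = os.fork()
    if pid == 0:
        # Child: drop any pooled connections inherited from the parent so
        # fresh ones are opened lazily. Two processes writing to the same
        # inherited MySQL socket is a classic source of "Lost connection to
        # MySQL server during query".
        engine.dispose()
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
        os._exit(0)
    os.waitpid(pid, 0)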

Revision history for this message
Mike Bayer (zzzeek) wrote :

I'd like to mark this as a dupe. Eric, can you please confirm for me? Thanks!

Revision history for this message
Eric Harney (eharney) wrote :

Oops, lost track of the other bug, thanks. :)
