test_postgresql_opportunistically fails with "database "openstack_citest" is being accessed by other users"

Bug #1393633 reported by Matt Riedemann
This bug affects 3 people
Affects / Status / Importance / Assigned to:
OpenStack Compute (nova): Fix Released / High / Viktor Serhieiev
oslo.db: Fix Released / High / Roman Podoliaka

Bug Description

Looks like this was previously fixed under bug 1328997 but this is back:

http://logs.openstack.org/72/135072/1/check/gate-nova-python27/ba44ca9/console.html#_2014-11-17_22_51_24_244

2014-11-17 22:51:24.244 | Captured traceback:
2014-11-17 22:51:24.244 | ~~~~~~~~~~~~~~~~~~~
2014-11-17 22:51:24.244 | Traceback (most recent call last):
2014-11-17 22:51:24.244 | File "nova/tests/unit/db/test_migrations.py", line 138, in test_postgresql_opportunistically
2014-11-17 22:51:24.245 | self._test_postgresql_opportunistically()
2014-11-17 22:51:24.245 | File "nova/tests/unit/db/test_migrations.py", line 429, in _test_postgresql_opportunistically
2014-11-17 22:51:24.245 | self._reset_database(database)
2014-11-17 22:51:24.245 | File "nova/tests/unit/db/test_migrations.py", line 336, in _reset_database
2014-11-17 22:51:24.245 | self._reset_pg(conn_pieces)
2014-11-17 22:51:24.245 | File "/home/jenkins/workspace/gate-nova-python27/.tox/py27/local/lib/python2.7/site-packages/oslo/concurrency/lockutils.py", line 311, in inner
2014-11-17 22:51:24.245 | return f(*args, **kwargs)
2014-11-17 22:51:24.245 | File "nova/tests/unit/db/test_migrations.py", line 245, in _reset_pg
2014-11-17 22:51:24.245 | self.execute_cmd(droptable)
2014-11-17 22:51:24.245 | File "nova/tests/unit/db/test_migrations.py", line 228, in execute_cmd
2014-11-17 22:51:24.245 | "Failed to run: %s\n%s" % (cmd, output))
2014-11-17 22:51:24.246 | File "/home/jenkins/workspace/gate-nova-python27/.tox/py27/local/lib/python2.7/site-packages/testtools/testcase.py", line 348, in assertEqual
2014-11-17 22:51:24.246 | self.assertThat(observed, matcher, message)
2014-11-17 22:51:24.246 | File "/home/jenkins/workspace/gate-nova-python27/.tox/py27/local/lib/python2.7/site-packages/testtools/testcase.py", line 433, in assertThat
2014-11-17 22:51:24.246 | raise mismatch_error
2014-11-17 22:51:24.246 | MismatchError: !=:
2014-11-17 22:51:24.246 | reference = ''
2014-11-17 22:51:24.246 | actual = u'''\
2014-11-17 22:51:24.246 | Unexpected error while running command.
2014-11-17 22:51:24.246 | Command: psql -w -U openstack_citest -h localhost -c 'drop database if exists openstack_citest;' -d postgres
2014-11-17 22:51:24.246 | Exit code: 1
2014-11-17 22:51:24.246 | Stdout: u''
2014-11-17 22:51:24.247 | Stderr: u'ERROR: database "openstack_citest" is being accessed by other users\\nDETAIL: There is 1 other session using the database.\\n\''''
2014-11-17 22:51:24.247 | : Failed to run: psql -w -U openstack_citest -h localhost -c 'drop database if exists openstack_citest;' -d postgres
2014-11-17 22:51:24.247 | Unexpected error while running command.
2014-11-17 22:51:24.247 | Command: psql -w -U openstack_citest -h localhost -c 'drop database if exists openstack_citest;' -d postgres
2014-11-17 22:51:24.247 | Exit code: 1
2014-11-17 22:51:24.247 | Stdout: u''
2014-11-17 22:51:24.247 | Stderr: u'ERROR: database "openstack_citest" is being accessed by other users\nDETAIL: There is 1 other session using the database.\n'
2014-11-17 22:51:24.247 | Traceback (most recent call last):
2014-11-17 22:51:24.247 | _StringException: Empty attachments:
2014-11-17 22:51:24.247 | pythonlogging:''
2014-11-17 22:51:24.247 | stderr
2014-11-17 22:51:24.248 | stdout

http://logstash.openstack.org/#eyJzZWFyY2giOiJtZXNzYWdlOlwiQ29tbWFuZDogcHNxbCAtdyAtVSBvcGVuc3RhY2tfY2l0ZXN0IC1oIGxvY2FsaG9zdCAtYyAnZHJvcCBkYXRhYmFzZSBpZiBleGlzdHMgb3BlbnN0YWNrX2NpdGVzdDsnIC1kIHBvc3RncmVzXCIgQU5EIHRhZ3M6XCJjb25zb2xlXCIgQU5EIGJ1aWxkX25hbWU6XCJnYXRlLW5vdmEtcHl0aG9uMjdcIiIsImZpZWxkcyI6W10sIm9mZnNldCI6MCwidGltZWZyYW1lIjoiNjA0ODAwIiwiZ3JhcGhtb2RlIjoiY291bnQiLCJ0aW1lIjp7InVzZXJfaW50ZXJ2YWwiOjB9LCJzdGFtcCI6MTQxNjI3NTg1MDI4MSwibW9kZSI6IiIsImFuYWx5emVfZmllbGQiOiIifQ==

516 hits in 7 days, check and gate, all failures.
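
The error itself is standard PostgreSQL behaviour: DROP DATABASE refuses to run while any other session is connected to the target database. A generic recipe for clearing that state by hand (not taken from the bug report, just the usual PostgreSQL approach; the column is named pid on 9.2+ and procpid on older releases) is to terminate lingering sessions first:

```sql
-- Kill every other session connected to the test database, then drop it.
SELECT pg_terminate_backend(pid)
FROM pg_stat_activity
WHERE datname = 'openstack_citest'
  AND pid <> pg_backend_pid();   -- don't terminate our own session
DROP DATABASE IF EXISTS openstack_citest;
```

The real question in this bug, of course, is what opened the extra session in the first place.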

Revision history for this message
Matt Riedemann (mriedem) wrote :

This showed up again on 11/17 and oslo.db 1.1.0 was just released today:

https://pypi.python.org/pypi/oslo.db/1.1.0

Changed in nova:
importance: Undecided → High
status: New → Confirmed
Revision history for this message
Matt Riedemann (mriedem) wrote :

Maybe this helps? Not sure.

https://review.openstack.org/#/c/103920/

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Related fix proposed to nova (master)

Related fix proposed to branch: master
Review: https://review.openstack.org/135151

Revision history for this message
Roman Podoliaka (rpodolyaka) wrote :

Hmm, this is interesting.

I believe bug 1328997 has little to do with this problem, as it was all about multiple connections to the template1 db. Looking at the commit logs and the changes we merged in 1.1.0, I can't see how they could possibly break existing tests. We'll take a closer look today.

Revision history for this message
Davanum Srinivas (DIMS) (dims-v) wrote :

Matt, I beat you by a whole hour :) https://bugs.launchpad.net/oslo.db/+bug/1393623

Changed in oslo.db:
status: New → Confirmed
assignee: nobody → Roman Podoliaka (rpodolyaka)
importance: Undecided → High
milestone: none → next-juno
status: Confirmed → In Progress
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to nova (master)

Fix proposed to branch: master
Review: https://review.openstack.org/135374

Changed in nova:
assignee: nobody → Victor Sergeyev (vsergeyev)
status: Confirmed → In Progress
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix merged to nova (master)

Reviewed: https://review.openstack.org/135374
Committed: https://git.openstack.org/cgit/openstack/nova/commit/?id=5c69b192895783022a80b42350a6059631b5d3f8
Submitter: Jenkins
Branch: master

commit 5c69b192895783022a80b42350a6059631b5d3f8
Author: Victor Sergeyev <email address hidden>
Date: Tue Nov 18 19:35:30 2014 +0200

    Add custom is_backend_avail() method

    Fast-and-dirty attempt to fix bug 1393633

    Closes-Bug: #1393633

    Change-Id: I2cc33ff4c6245e3e541222df60c0ca0a44b3d75a
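
The shape of the fix suggests the offending session came from the backend-availability probe itself. As a hypothetical illustration only (not the actual nova or oslo.db code, and using sqlite3 as a stand-in engine), the essential property of such a probe is that it must always close the connection it opens; a probe that leaks its connection leaves a session attached to the database, which is exactly what makes a later `drop database` fail on PostgreSQL:

```python
import sqlite3  # stand-in for a real database driver, for illustration


def is_backend_avail(connect_uri):
    """Return True if the backend accepts connections.

    The crucial detail is the finally block: the probe connection is
    closed no matter what, so the check never leaves a session behind
    that would block a subsequent DROP DATABASE.
    """
    conn = None
    try:
        conn = sqlite3.connect(connect_uri)
        conn.execute("SELECT 1")
        return True
    except sqlite3.Error:
        return False
    finally:
        if conn is not None:
            conn.close()  # do not leak the probe's session
```

A probe written without the explicit close would behave identically in the success path but silently pin the database open until the connection was garbage-collected, which matches the intermittent "1 other session" failures seen in the gate.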

Changed in nova:
status: In Progress → Fix Committed
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Change abandoned on nova (master)

Change abandoned by Matt Riedemann (<email address hidden>) on branch: master
Review: https://review.openstack.org/135151
Reason: A separate stop-gap was merged for now:

https://review.openstack.org/#/c/135374/

Revision history for this message
Qin Zhao (zhaoqin) wrote :

Also encounter this issue in stable/juno CI.

tags: added: juno-backport-potential
Revision history for this message
Alan Pevec (apevec) wrote :

> The final solution https://review.openstack.org/#/c/103920/19 that includes a bunch of refactoring too.

This was already backported by Adam in https://review.openstack.org/136538
and I'm tracking it on list for 2014.2.1 https://etherpad.openstack.org/p/StableJuno
(details on collateral changes required to make it pass in etherpad)

Revision history for this message
Viktor Serhieiev (vsergeyev) wrote :
Changed in oslo.db:
status: In Progress → Fix Committed
Changed in oslo.db:
status: Fix Committed → Fix Released
milestone: next-juno → 1.1.0
Thierry Carrez (ttx)
Changed in nova:
milestone: none → kilo-1
status: Fix Committed → Fix Released
Thierry Carrez (ttx)
Changed in nova:
milestone: kilo-1 → 2015.1.0