Volume deletion fails with 'unable to deactivate logical volume'

Bug #1302332 reported by Kiyohiro Adachi
This bug affects 2 people
Affects                        Status      Importance  Assigned to  Milestone
Cinder                         Invalid     Undecided   Unassigned
OpenStack Core Infrastructure  Incomplete  Undecided   Unassigned
tempest                        Invalid     Undecided   Unassigned

Bug Description

http://logs.openstack.org/75/77075/2/gate/gate-tempest-dsvm-full/aece4c9/

screen-c-vol.txt.gz

2014-04-04 02:34:57.400 3105 ERROR oslo.messaging.rpc.dispatcher [req-c4825906-656a-44a1-ae3b-c4b723aba367 37d5dbcf717c4791aa9ac6ef3d11a8a0 3ee380054c7342449e10fb922ef93a90 - - -] Exception during message handling: Unexpected error while running command.
Command: sudo cinder-rootwrap /etc/cinder/rootwrap.conf lvremove -f stack-volumes/volume-a368b9e9-a8a6-404a-95d3-227f35c3b085
Exit code: 5
Stdout: ''
Stderr: ' LV stack-volumes/volume-a368b9e9-a8a6-404a-95d3-227f35c3b085 in use: not deactivating\n Unable to deactivate logical volume "volume-a368b9e9-a8a6-404a-95d3-227f35c3b085"\n'
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher Traceback (most recent call last):
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo/messaging/rpc/dispatcher.py", line 133, in _dispatch_and_reply
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher incoming.message))
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo/messaging/rpc/dispatcher.py", line 176, in _dispatch
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher return self._do_dispatch(endpoint, method, ctxt, args)
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo/messaging/rpc/dispatcher.py", line 122, in _do_dispatch
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher result = getattr(endpoint, method)(ctxt, **new_args)
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher File "/opt/stack/new/cinder/cinder/volume/manager.py", line 144, in lvo_inner1
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher return lvo_inner2(inst, context, volume_id, **kwargs)
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher File "/opt/stack/new/cinder/cinder/openstack/common/lockutils.py", line 233, in inner
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher retval = f(*args, **kwargs)
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher File "/opt/stack/new/cinder/cinder/volume/manager.py", line 143, in lvo_inner2
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher return f(*_args, **_kwargs)
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher File "/opt/stack/new/cinder/cinder/volume/manager.py", line 416, in delete_volume
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher {'status': 'error_deleting'})
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher File "/opt/stack/new/cinder/cinder/openstack/common/excutils.py", line 68, in __exit__
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb)
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher File "/opt/stack/new/cinder/cinder/volume/manager.py", line 405, in delete_volume
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher self.driver.delete_volume(volume_ref)
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher File "/opt/stack/new/cinder/cinder/volume/drivers/lvm.py", line 233, in delete_volume
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher self._delete_volume(volume)
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher File "/opt/stack/new/cinder/cinder/volume/drivers/lvm.py", line 133, in _delete_volume
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher self.vg.delete(name)
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher File "/opt/stack/new/cinder/cinder/brick/local_dev/lvm.py", line 610, in delete
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher root_helper=self._root_helper, run_as_root=True)
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher File "/opt/stack/new/cinder/cinder/utils.py", line 136, in execute
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher return processutils.execute(*cmd, **kwargs)
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher File "/opt/stack/new/cinder/cinder/openstack/common/processutils.py", line 173, in execute
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher cmd=' '.join(cmd))
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher ProcessExecutionError: Unexpected error while running command.
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher Command: sudo cinder-rootwrap /etc/cinder/rootwrap.conf lvremove -f stack-volumes/volume-a368b9e9-a8a6-404a-95d3-227f35c3b085
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher Exit code: 5
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher Stdout: ''
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher Stderr: ' LV stack-volumes/volume-a368b9e9-a8a6-404a-95d3-227f35c3b085 in use: not deactivating\n Unable to deactivate logical volume "volume-a368b9e9-a8a6-404a-95d3-227f35c3b085"\n'
2014-04-04 02:34:57.400 3105 TRACE oslo.messaging.rpc.dispatcher

Revision history for this message
Clark Boylan (cboylan) wrote :

This looks like a tempest test that failed because it tried to remove a cinder volume while that volume was still in use. This is either a bug in cinder (it didn't stop using the volume before attempting to delete it) or in the tempest test (it didn't stop using the volume before telling cinder to delete it). I do not think this is an infrastructure bug. If you have evidence to the contrary, please update the bug and I can triage it further as an infrastructure issue.
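Races like this one are commonly mitigated on the storage side by retrying the removal a few times with backoff, since the device may only be held open transiently (e.g. by udev or a lingering iSCSI session). The sketch below is illustrative only and is not Cinder's actual code: real drivers shell out to `lvremove` via rootwrap and inspect the process exit code and stderr, whereas this hypothetical helper matches on an exception message.

```python
import time


def retry_on_busy(fn, attempts=3, delay=0.5, busy_marker="in use"):
    """Call fn(), retrying with exponential backoff while it fails busy.

    Illustrative sketch, not Cinder's implementation. `fn` stands in for
    the lvremove invocation; a failure whose message contains
    `busy_marker` (like the 'LV ... in use: not deactivating' stderr in
    this report) is treated as transient and retried.
    """
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except RuntimeError as exc:
            if busy_marker not in str(exc) or attempt == attempts:
                raise  # not a busy error, or out of retries
            time.sleep(delay)
            delay *= 2  # back off before the next attempt
```

If the holder never releases the device, the last attempt re-raises, which matches the failure seen in this log when the retry window is exhausted.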

Changed in openstack-ci:
status: New → Incomplete
Revision history for this message
Ghanshyam Mann (ghanshyammann) wrote :

There is not much of a trace to go on here.

But this could be the same issue as https://bugs.launchpad.net/cinder/+bug/1307394.

It might be a Cinder issue where the volume status is not transitioned within the required time, but more logs/traces are needed to confirm that. Until then, marking this as incomplete on the Tempest side.
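On the Tempest side, the usual guard against status-transition races is to poll the volume's status until it reaches a stable state before issuing the delete. The following is a minimal waiter sketch under assumptions: `get_status` is a hypothetical callable standing in for a GET on the volume, and this is not Tempest's actual waiter (which lives in `tempest/common/waiters.py` and handles more states).

```python
import time


def wait_for_status(get_status, wanted="available", timeout=30.0, interval=1.0):
    """Poll get_status() until it returns `wanted`, times out, or errors.

    Illustrative sketch only. Failing fast on 'error*' states avoids
    burning the whole timeout on a volume that will never recover.
    """
    deadline = time.monotonic() + timeout
    status = get_status()
    while status != wanted:
        if status.startswith("error"):
            raise RuntimeError("volume went to %r while waiting" % status)
        if time.monotonic() >= deadline:
            raise TimeoutError("volume still %r after %ss" % (status, timeout))
        time.sleep(interval)
        status = get_status()
    return status
```

A test that deletes the volume only after such a waiter returns would rule out the "Cinder is slow to transition the status" theory raised above.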

Changed in tempest:
status: New → Incomplete
Revision history for this message
Yaroslav Lobankov (ylobankov) wrote :

It looks like this is not a Tempest issue; the problem is most likely somewhere in Cinder. Moving the bug status to "Invalid".

Changed in tempest:
status: Incomplete → Invalid
Revision history for this message
Sean McGinnis (sean-mcginnis) wrote :

Not sure whether this has the same root cause as the bugs mentioned above, but it has not been seen for some time. Marking as invalid. Please reopen if this is still happening with the latest code.

Changed in cinder:
status: New → Invalid