Cinder attachment delete API failure leaves volume in state 'detaching'

Bug #2016173 reported by melanie witt
Affects: OpenStack Compute (nova)
Status: In Progress
Importance: Undecided
Assigned to: melanie witt

Bug Description

Reported by Gorka Eguileor of the Cinder team and confirmed in a local devstack:

During a volume detach, if the Cinder API responds with an error when Nova calls the attachment delete API, the volume is left in status 'detaching' instead of being returned to 'in-use'.

Once the volume is in 'detaching', any subsequent detach attempt is rejected because the volume status is not 'in-use'.

Repro steps (a complete command sketch follows the list):

1. Create a volume
openstack volume create

2. Create an instance
openstack server create

3. Attach the volume to the instance
openstack server add volume

4. Fake an error in the Cinder attachment delete API
(I just raised an exception in cinder/api/v3/attachments.py)

5. Try to detach the volume from the instance
openstack server remove volume
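
As a sketch, the full sequence might look like the following; the resource names (vol1, vm1) and the flavor/image are hypothetical and will vary per environment:

$ openstack volume create --size 1 vol1
$ openstack server create --flavor m1.tiny --image cirros --network private vm1
$ openstack server add volume vm1 vol1
(fault injection: raise an exception in the attachment delete handler in
cinder/api/v3/attachments.py, then restart the Cinder API service)
$ openstack server remove volume vm1 vol1    # fails as expected
$ openstack volume show vol1 -c status       # stuck at 'detaching'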

Expected result:

Detach fails and the volume remains in 'in-use' status

Actual result:

Detach fails and the volume is stuck in 'detaching' status

Nova calls the Cinder API to move the volume status to 'detaching' in compute/api, but if the subsequent Cinder attachment delete call returns an error, it never calls the Cinder API to roll the volume status back from 'detaching' to 'in-use'.

I think we just need to call the rollback-detaching Cinder API in this case to resolve the issue (sketched below).
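
A minimal sketch of that shape of fix, assuming the existing nova.volume.cinder helpers begin_detaching, attachment_delete, and roll_detaching; the detach_volume wrapper below is hypothetical and only illustrative, the actual change is in the review linked in the next comment:

from oslo_utils import excutils

# Hypothetical wrapper for illustration; volume_api stands in for
# nova.volume.cinder.API().
def detach_volume(volume_api, context, volume_id, attachment_id):
    # Cinder's os-begin_detaching action: 'in-use' -> 'detaching'.
    volume_api.begin_detaching(context, volume_id)
    try:
        # The attachment delete call that can fail on the Cinder side.
        volume_api.attachment_delete(context, attachment_id)
    except Exception:
        # Re-raise the original error, but first invoke Cinder's
        # os-roll_detaching action: 'detaching' -> 'in-use'.
        with excutils.save_and_reraise_exception():
            volume_api.roll_detaching(context, volume_id)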

Tags: volumes
OpenStack Infra (hudson-openstack) wrote: Fix proposed to nova (master)

Fix proposed to branch: master
Review: https://review.opendev.org/c/openstack/nova/+/880399

Changed in nova:
status: Triaged → In Progress
Amit Uniyal (auniyal) wrote:

Was able to reproduce by stopping the devstack@c-vol service before running

$ openstack server remove volume

The volume stayed in 'detaching'.

Later, on restarting the devstack@c-vol service, the volume became available by itself, without any intervention.
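
For reference, a sketch of that repro on a systemd-based devstack (unit name and the vol1/vm1 resource names are assumed):

$ sudo systemctl stop devstack@c-vol
$ openstack server remove volume vm1 vol1    # fails; volume stuck in 'detaching'
$ sudo systemctl start devstack@c-vol
$ openstack volume show vol1 -c status       # recovers on its own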
