Detach volume from instance using the Compute API got HTTP 409

Bug #2019888 reported by Jeffrey Chang
This bug report is a duplicate of:  Bug #2020111: CVE-2023-2088 regressions.
This bug affects 11 people
Affects: OpenStack Nova Compute Charm
Status: New
Importance: Undecided
Assigned to: Unassigned

Bug Description

Solutions QA noticed 3 tempest failures with "Detach volume from instance dd0deef9-d340-40d3-a0c7-fc527bbc0360 using the Compute API (HTTP 409)" in the nova-compute logs, from
https://solutions.qa.canonical.com/v2/testruns/88316a8c-4c9f-4661-8b67-b2feb448df34
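
For context, the tempest waiter that times out in all three failures below is essentially a poll-until-deadline loop on the volume status. A minimal sketch of the idea (illustrative only, not tempest's actual implementation; the volumes_client and show_volume names are assumptions):

    import time

    class TimeoutException(Exception):
        pass

    def wait_for_volume_status(volumes_client, volume_id, target,
                               timeout=600, interval=5):
        """Poll the volume until it reaches `target` status or we time out."""
        deadline = time.time() + timeout
        while time.time() < deadline:
            status = volumes_client.show_volume(volume_id)['volume']['status']
            if status == target:
                return
            time.sleep(interval)
        raise TimeoutException('volume %s failed to reach %s status '
                               'within %s s' % (volume_id, target, timeout))

When nova-compute cannot delete the Cinder attachment (the HTTP 409 in the logs below), the volume never leaves the in-use/detaching status, so this loop runs out its 600 s and raises the TimeoutException seen in the tracebacks.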

1) tempest.api.compute.servers.test_delete_server.DeleteServersTestJSON.test_delete_server_while_in_attached_volume

Traceback (most recent call last):
  File "/home/ubuntu/snap/fcbtest/43/.rally/verification/verifier-139f735d-d56e-4087-ab22-e6c4e516f1c5/repo/tempest/common/utils/__init__.py", line 70, in wrapper
    return f(*func_args, **func_kwargs)
  File "/home/ubuntu/snap/fcbtest/43/.rally/verification/verifier-139f735d-d56e-4087-ab22-e6c4e516f1c5/repo/tempest/api/compute/servers/test_delete_server.py", line 119, in test_delete_server_while_in_attached_volume
    waiters.wait_for_volume_resource_status(self.volumes_client,
  File "/home/ubuntu/snap/fcbtest/43/.rally/verification/verifier-139f735d-d56e-4087-ab22-e6c4e516f1c5/repo/tempest/common/waiters.py", line 346, in wait_for_volume_resource_status
    raise lib_exc.TimeoutException(message)
tempest.lib.exceptions.TimeoutException: Request timed out
Details: volume a692e177-11e2-4e95-a05d-05b2e1173f5d failed to reach available status (current in-use) within the required time (600 s).

Nova compute log

2023-05-15 21:35:27.534 1895423 INFO nova.compute.manager [req-b6967a9e-3536-4fab-8a72-4906658b49a6 3e175fe33f424a96b583ae1b8b20fd15 6ff93a66db044c1e97625c93d7e4fd8a - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] [instance: 5909deba-6e90-4548-a5fa-c33945fa69b3] Attaching volume a692e177-11e2-4e95-a05d-05b2e1173f5d to /dev/vdb
2023-05-15 21:35:27.592 1895423 INFO oslo.privsep.daemon [req-b6967a9e-3536-4fab-8a72-4906658b49a6 3e175fe33f424a96b583ae1b8b20fd15 6ff93a66db044c1e97625c93d7e4fd8a - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmpn7eonbte/privsep.sock']
2023-05-15 21:35:27.675 1895423 INFO nova.compute.manager [-] [instance: df9d9d15-b84b-457a-b97d-c1676d46ab97] VM Stopped (Lifecycle Event)
2023-05-15 21:35:28.035 1895423 WARNING oslo.privsep.daemon [-] privsep log: Deprecated: Option "logdir" from group "DEFAULT" is deprecated. Use option "log-dir" from group "DEFAULT".
2023-05-15 21:35:28.141 1895423 INFO oslo.privsep.daemon [req-b6967a9e-3536-4fab-8a72-4906658b49a6 3e175fe33f424a96b583ae1b8b20fd15 6ff93a66db044c1e97625c93d7e4fd8a - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Spawned new privsep daemon via rootwrap
2023-05-15 21:35:28.057 326132 INFO oslo.privsep.daemon [-] privsep daemon starting
2023-05-15 21:35:28.060 326132 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
2023-05-15 21:35:28.061 326132 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
2023-05-15 21:35:28.061 326132 INFO oslo.privsep.daemon [-] privsep daemon running as pid 326132
2023-05-15 21:35:28.271 1895423 WARNING os_brick.initiator.connectors.nvmeof [req-b6967a9e-3536-4fab-8a72-4906658b49a6 3e175fe33f424a96b583ae1b8b20fd15 6ff93a66db044c1e97625c93d7e4fd8a - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Process execution error in _get_host_uuid: [Errno 13] Permission denied
Command: blkid /dev/sda4 -s UUID -o value
Exit code: -
Stdout: None
Stderr: None: oslo_concurrency.processutils.ProcessExecutionError: [Errno 13] Permission denied
2023-05-15 21:35:28.284 1895423 WARNING os_brick.initiator.connectors.nvmeof [req-b6967a9e-3536-4fab-8a72-4906658b49a6 3e175fe33f424a96b583ae1b8b20fd15 6ff93a66db044c1e97625c93d7e4fd8a - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Unknown error when checking presence of nvme: [Errno 13] Permission denied: 'nvme': PermissionError: [Errno 13] Permission denied: 'nvme'
2023-05-15 21:35:28.288 326132 WARNING os_brick.privileged.nvmeof [-] Could not generate host nqn: [Errno 13] Permission denied: 'nvme'

2023-05-15 21:35:32.421 1895423 WARNING nova.compute.manager [req-61d23c68-6068-4bc0-93ba-aed8e7889714 3e175fe33f424a96b583ae1b8b20fd15 6ff93a66db044c1e97625c93d7e4fd8a - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] [instance: 5909deba-6e90-4548-a5fa-c33945fa69b3] Ignoring unknown cinder exception for volume a692e177-11e2-4e95-a05d-05b2e1173f5d: ConflictNovaUsingAttachment: Detach volume from instance 5909deba-6e90-4548-a5fa-c33945fa69b3 using the Compute API (HTTP 409) (Request-ID: req-6e5eb7a4-fd98-4e07-9f0f-cd70e8805fd8): cinderclient.exceptions.ClientException: ConflictNovaUsingAttachment: Detach volume from instance 5909deba-6e90-4548-a5fa-c33945fa69b3 using the Compute API (HTTP 409) (Request-ID: req-6e5eb7a4-fd98-4e07-9f0f-cd70e8805fd8)
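
As far as we understand, the 409 itself comes from Cinder's hardening for CVE-2023-2088: deleting the attachment of an in-use volume is only accepted when the request carries a service token identifying the caller as a trusted service such as Nova; otherwise Cinder answers ConflictNovaUsingAttachment and redirects the caller to the Compute API. A minimal sketch of sending such a request with a service token via keystoneauth1 (auth_url, credentials, and the attachment id are placeholders):

    from cinderclient import client as cinder_client
    from keystoneauth1 import session
    from keystoneauth1.identity import v3
    from keystoneauth1.service_token import ServiceTokenAuthWrapper

    # Credentials of the end user (or of Nova acting on the user's behalf).
    user_auth = v3.Password(auth_url='http://keystone:5000/v3',
                            username='user', password='...',
                            project_name='project',
                            user_domain_name='Default',
                            project_domain_name='Default')

    # Service credentials, corresponding to nova.conf [service_user].
    service_auth = v3.Password(auth_url='http://keystone:5000/v3',
                               username='nova', password='...',
                               project_name='services',
                               user_domain_name='Default',
                               project_domain_name='Default')

    # Requests made through this session carry both X-Auth-Token and
    # X-Service-Token headers; the service token is what Cinder checks
    # before it allows deleting the attachment of an in-use volume.
    sess = session.Session(auth=ServiceTokenAuthWrapper(
        user_auth=user_auth, service_auth=service_auth))
    cinder = cinder_client.Client('3.27', session=sess)
    cinder.attachments.delete('<attachment-id>')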

2) tempest.api.compute.volumes.test_attach_volume.AttachVolumeTestJSON.test_attach_detach_volume

Traceback (most recent call last):
  File "/home/ubuntu/snap/fcbtest/43/.rally/verification/verifier-139f735d-d56e-4087-ab22-e6c4e516f1c5/repo/tempest/api/compute/volumes/test_attach_volume.py", line 115, in test_attach_detach_volume
    waiters.wait_for_volume_resource_status(
  File "/home/ubuntu/snap/fcbtest/43/.rally/verification/verifier-139f735d-d56e-4087-ab22-e6c4e516f1c5/repo/tempest/common/waiters.py", line 346, in wait_for_volume_resource_status
    raise lib_exc.TimeoutException(message)
tempest.lib.exceptions.TimeoutException: Request timed out
Details: volume c5f92a81-f90b-49bf-bfad-0ef89929d71e failed to reach available status (current detaching) within the required time (600 s).

Nova compute log

2023-05-15 21:40:13.088 1036177 INFO nova.compute.manager [req-28ba4775-05bb-4b93-a8e3-218adf37506e fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] [instance: e1f02ece-fa89-4624-b0bd-03d3ac187357] Detaching volume c5f92a81-f90b-49bf-bfad-0ef89929d71e
2023-05-15 21:40:13.165 1036177 INFO nova.virt.block_device [req-28ba4775-05bb-4b93-a8e3-218adf37506e fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] [instance: e1f02ece-fa89-4624-b0bd-03d3ac187357] Attempting to driver detach volume c5f92a81-f90b-49bf-bfad-0ef89929d71e from mountpoint /dev/vdb
2023-05-15 21:40:13.207 1036177 INFO nova.virt.libvirt.driver [req-28ba4775-05bb-4b93-a8e3-218adf37506e fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Successfully detached device vdb from instance e1f02ece-fa89-4624-b0bd-03d3ac187357 from the persistent domain config.
2023-05-15 21:40:13.404 1036177 INFO nova.virt.libvirt.driver [req-28ba4775-05bb-4b93-a8e3-218adf37506e fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Successfully detached device vdb from instance e1f02ece-fa89-4624-b0bd-03d3ac187357 from the live domain config.
2023-05-15 21:40:13.683 1036177 ERROR nova.volume.cinder [req-28ba4775-05bb-4b93-a8e3-218adf37506e fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Delete attachment failed for attachment 8bb134d9-853f-4c79-a9dc-148646742999. Error: ConflictNovaUsingAttachment: Detach volume from instance e1f02ece-fa89-4624-b0bd-03d3ac187357 using the Compute API (HTTP 409) (Request-ID: req-4f64f96d-c73b-400f-93e8-0e62893f3347) Code: 409: cinderclient.exceptions.ClientException: ConflictNovaUsingAttachment: Detach volume from instance e1f02ece-fa89-4624-b0bd-03d3ac187357 using the Compute API (HTTP 409) (Request-ID: req-4f64f96d-c73b-400f-93e8-0e62893f3347)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server [req-28ba4775-05bb-4b93-a8e3-218adf37506e fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Exception during message handling: cinderclient.exceptions.ClientException: ConflictNovaUsingAttachment: Detach volume from instance e1f02ece-fa89-4624-b0bd-03d3ac187357 using the Compute API (HTTP 409) (Request-ID: req-4f64f96d-c73b-400f-93e8-0e62893f3347)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/exception_wrapper.py", line 71, in wrapped
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification(
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server self.force_reraise()
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server raise self.value
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/exception_wrapper.py", line 63, in wrapped
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/utils.py", line 1439, in decorated_function
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/manager.py", line 212, in decorated_function
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context,
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server self.force_reraise()
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server raise self.value
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/manager.py", line 201, in decorated_function
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/manager.py", line 7311, in detach_volume
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server do_detach_volume(context, volume_id, instance, attachment_id)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py", line 391, in inner
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server return f(*args, **kwargs)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/manager.py", line 7308, in do_detach_volume
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server self._detach_volume(context, bdm, instance,
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/manager.py", line 7259, in _detach_volume
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server driver_bdm.detach(context, instance, self.volume_api, self.driver,
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/virt/block_device.py", line 473, in detach
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server self._do_detach(context, instance, volume_api, virt_driver,
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/virt/block_device.py", line 454, in _do_detach
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server volume_api.attachment_delete(context, self['attachment_id'])
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/volume/cinder.py", line 395, in wrapper
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server res = method(self, ctx, *args, **kwargs)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/volume/cinder.py", line 449, in wrapper
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server res = method(self, ctx, attachment_id, *args, **kwargs)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/retrying.py", line 49, in wrapped_f
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server return Retrying(*dargs, **dkw).call(f, *args, **kw)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/retrying.py", line 206, in call
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server return attempt.get(self._wrap_exception)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/retrying.py", line 247, in get
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server six.reraise(self.value[0], self.value[1], self.value[2])
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/six.py", line 703, in reraise
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server raise value
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/retrying.py", line 200, in call
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/volume/cinder.py", line 903, in attachment_delete
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server LOG.error('Delete attachment failed for attachment '
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server self.force_reraise()
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server raise self.value
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/volume/cinder.py", line 894, in attachment_delete
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server cinderclient(
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/cinderclient/api_versions.py", line 421, in substitution
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server return method.func(obj, *args, **kwargs)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/cinderclient/v3/attachments.py", line 45, in delete
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server return self._delete("/attachments/%s" % base.getid(attachment))
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/cinderclient/base.py", line 313, in _delete
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server resp, body = self.api.client.delete(url)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/cinderclient/client.py", line 233, in delete
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server return self._cs_request(url, 'DELETE', **kwargs)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/cinderclient/client.py", line 215, in _cs_request
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server return self.request(url, method, **kwargs)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/cinderclient/client.py", line 201, in request
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server raise exceptions.from_response(resp, body)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server cinderclient.exceptions.ClientException: ConflictNovaUsingAttachment: Detach volume from instance e1f02ece-fa89-4624-b0bd-03d3ac187357 using the Compute API (HTTP 409) (Request-ID: req-4f64f96d-c73b-400f-93e8-0e62893f3347)
2023-05-15 21:40:13.764 1036177 ERROR oslo_messaging.rpc.server
2023-05-15 21:50:56.312 1036177 INFO nova.virt.libvirt.imagecache [req-cc9c0022-d9d8-47cb-b139-02cbb60fcb6f - - - - -] image ed89048d-4668-426c-9681-213ef76bbb05 at (/var/lib/nova/instances/_base/01d484c47de824e600e7591cdee04320739a7b15): checking
2023-05-15 21:50:56.372 1036177 WARNING nova.virt.libvirt.imagecache [req-cc9c0022-d9d8-47cb-b139-02cbb60fcb6f - - - - -] Unknown base file: /var/lib/nova/instances/_base/b18dbeae60c7f0d6dbbc699ed4b5c900b933a3cb
2023-05-15 21:50:56.373 1036177 INFO nova.virt.libvirt.imagecache [req-cc9c0022-d9d8-47cb-b139-02cbb60fcb6f - - - - -] Active base files: /var/lib/nova/instances/_base/01d484c47de824e600e7591cdee04320739a7b15
2023-05-15 21:50:56.373 1036177 INFO nova.virt.libvirt.imagecache [req-cc9c0022-d9d8-47cb-b139-02cbb60fcb6f - - - - -] Removable base files: /var/lib/nova/instances/_base/b18dbeae60c7f0d6dbbc699ed4b5c900b933a3cb
2023-05-15 21:50:56.374 1036177 INFO nova.virt.libvirt.imagecache [req-cc9c0022-d9d8-47cb-b139-02cbb60fcb6f - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/b18dbeae60c7f0d6dbbc699ed4b5c900b933a3cb
2023-05-15 22:00:14.083 1036177 INFO nova.compute.manager [req-f8e01879-f4b1-48ec-8470-4de29b05e34d fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] [instance: e1f02ece-fa89-4624-b0bd-03d3ac187357] Get console output
2023-05-15 22:00:14.085 1237432 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
2023-05-15 22:00:45.309 1036177 INFO nova.compute.manager [req-cc9c0022-d9d8-47cb-b139-02cbb60fcb6f - - - - -] Running instance usage audit for host solqa-lab1-server-08.nosilo.lab1.solutionsqa from 2023-05-15 21:00:00 to 2023-05-15 22:00:00. 6 instances.
2023-05-15 22:10:15.297 1036177 INFO nova.compute.manager [req-ffdf985f-34a3-4dfb-916b-137f5fd01342 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] [instance: e1f02ece-fa89-4624-b0bd-03d3ac187357] Terminating instance
2023-05-15 22:10:16.063 1036177 INFO nova.virt.libvirt.driver [-] [instance: e1f02ece-fa89-4624-b0bd-03d3ac187357] Instance destroyed successfully.
2023-05-15 22:10:16.078 1036177 INFO os_vif [req-ffdf985f-34a3-4dfb-916b-137f5fd01342 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:f9:12,bridge_name='br-int',has_traffic_filtering=True,id=4f9622e2-48e5-47c7-9656-1c7824c50d90,network=Network(b74b0dfe-06e9-4bde-a56e-7b2cdad47088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f9622e2-48')
2023-05-15 22:10:16.080 1036177 INFO nova.virt.libvirt.driver [req-ffdf985f-34a3-4dfb-916b-137f5fd01342 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] [instance: e1f02ece-fa89-4624-b0bd-03d3ac187357] Deleting instance files /var/lib/nova/instances/e1f02ece-fa89-4624-b0bd-03d3ac187357_del
2023-05-15 22:10:16.081 1036177 INFO nova.virt.libvirt.driver [req-ffdf985f-34a3-4dfb-916b-137f5fd01342 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] [instance: e1f02ece-fa89-4624-b0bd-03d3ac187357] Deletion of /var/lib/nova/instances/e1f02ece-fa89-4624-b0bd-03d3ac187357_del complete
2023-05-15 22:10:16.128 1036177 INFO nova.compute.manager [req-ffdf985f-34a3-4dfb-916b-137f5fd01342 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] [instance: e1f02ece-fa89-4624-b0bd-03d3ac187357] Took 0.83 seconds to destroy the instance on the hypervisor.
2023-05-15 22:10:17.152 1036177 INFO nova.compute.manager [-] [instance: e1f02ece-fa89-4624-b0bd-03d3ac187357] Took 1.02 seconds to deallocate network for instance.
2023-05-15 22:10:17.388 1036177 ERROR nova.volume.cinder [req-ffdf985f-34a3-4dfb-916b-137f5fd01342 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Delete attachment failed for attachment 8bb134d9-853f-4c79-a9dc-148646742999. Error: ConflictNovaUsingAttachment: Detach volume from instance e1f02ece-fa89-4624-b0bd-03d3ac187357 using the Compute API (HTTP 409) (Request-ID: req-37e36c7c-41db-46ff-a727-0b56aa759135) Code: 409: cinderclient.exceptions.ClientException: ConflictNovaUsingAttachment: Detach volume from instance e1f02ece-fa89-4624-b0bd-03d3ac187357 using the Compute API (HTTP 409) (Request-ID: req-37e36c7c-41db-46ff-a727-0b56aa759135)
2023-05-15 22:10:17.388 1036177 WARNING nova.compute.manager [req-ffdf985f-34a3-4dfb-916b-137f5fd01342 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] [instance: e1f02ece-fa89-4624-b0bd-03d3ac187357] Ignoring unknown cinder exception for volume c5f92a81-f90b-49bf-bfad-0ef89929d71e: ConflictNovaUsingAttachment: Detach volume from instance e1f02ece-fa89-4624-b0bd-03d3ac187357 using the Compute API (HTTP 409) (Request-ID: req-37e36c7c-41db-46ff-a727-0b56aa759135): cinderclient.exceptions.ClientException: ConflictNovaUsingAttachment: Detach volume from instance e1f02ece-fa89-4624-b0bd-03d3ac187357 using the Compute API (HTTP 409) (Request-ID: req-37e36c7c-41db-46ff-a727-0b56aa759135)

3) tempest.api.compute.volumes.test_attach_volume.AttachVolumeTestJSON.test_list_get_volume_attachments

Traceback (most recent call last):
  File "/home/ubuntu/snap/fcbtest/43/.rally/verification/verifier-139f735d-d56e-4087-ab22-e6c4e516f1c5/repo/tempest/api/compute/volumes/test_attach_volume.py", line 181, in test_list_get_volume_attachments
    waiters.wait_for_volume_resource_status(
  File "/home/ubuntu/snap/fcbtest/43/.rally/verification/verifier-139f735d-d56e-4087-ab22-e6c4e516f1c5/repo/tempest/common/waiters.py", line 346, in wait_for_volume_resource_status
    raise lib_exc.TimeoutException(message)
tempest.lib.exceptions.TimeoutException: Request timed out
Details: volume b7ab5dc6-c554-4869-84e5-d1d650e8c649 failed to reach available status (current detaching) within the required time (600 s).

Nova compute log

2023-05-15 22:10:48.194 1586545 INFO nova.compute.manager [req-008a4179-9a88-42d6-bcea-cb6a35a7437b fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] [instance: dd0deef9-d340-40d3-a0c7-fc527bbc0360] Attaching volume b7ab5dc6-c554-4869-84e5-d1d650e8c649 to /dev/vdb
2023-05-15 22:10:48.253 1586545 WARNING os_brick.initiator.connectors.nvmeof [req-008a4179-9a88-42d6-bcea-cb6a35a7437b fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Process execution error in _get_host_uuid: [Errno 13] Permission denied
Command: blkid /dev/sda4 -s UUID -o value
Exit code: -
Stdout: None
Stderr: None: oslo_concurrency.processutils.ProcessExecutionError: [Errno 13] Permission denied
2023-05-15 22:10:48.283 1586545 WARNING os_brick.initiator.connectors.nvmeof [req-008a4179-9a88-42d6-bcea-cb6a35a7437b fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Unknown error when checking presence of nvme: [Errno 13] Permission denied: 'nvme': PermissionError: [Errno 13] Permission denied: 'nvme'
2023-05-15 22:10:48.289 1900199 WARNING os_brick.privileged.nvmeof [-] Could not generate host nqn: [Errno 13] Permission denied: 'nvme'
2023-05-15 22:10:51.432 1586545 INFO nova.compute.manager [req-4f47b299-a697-4114-8662-08c33607ae71 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] [instance: dd0deef9-d340-40d3-a0c7-fc527bbc0360] Attaching volume 638f45df-f248-4824-8180-85fb401162ea to /dev/vdc
2023-05-15 22:10:51.501 1586545 WARNING os_brick.initiator.connectors.nvmeof [req-4f47b299-a697-4114-8662-08c33607ae71 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Process execution error in _get_host_uuid: [Errno 13] Permission denied
Command: blkid /dev/sda4 -s UUID -o value
Exit code: -
Stdout: None
Stderr: None: oslo_concurrency.processutils.ProcessExecutionError: [Errno 13] Permission denied
2023-05-15 22:10:51.522 1586545 WARNING os_brick.initiator.connectors.nvmeof [req-4f47b299-a697-4114-8662-08c33607ae71 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Unknown error when checking presence of nvme: [Errno 13] Permission denied: 'nvme': PermissionError: [Errno 13] Permission denied: 'nvme'
2023-05-15 22:10:51.529 1900199 WARNING os_brick.privileged.nvmeof [-] Could not generate host nqn: [Errno 13] Permission denied: 'nvme'
2023-05-15 22:10:53.221 1586545 INFO nova.compute.manager [req-45be54a6-d166-44af-b753-e5e3b41699b6 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] [instance: dd0deef9-d340-40d3-a0c7-fc527bbc0360] Detaching volume b7ab5dc6-c554-4869-84e5-d1d650e8c649
2023-05-15 22:10:53.286 1586545 INFO nova.virt.block_device [req-45be54a6-d166-44af-b753-e5e3b41699b6 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] [instance: dd0deef9-d340-40d3-a0c7-fc527bbc0360] Attempting to driver detach volume b7ab5dc6-c554-4869-84e5-d1d650e8c649 from mountpoint /dev/vdb
2023-05-15 22:10:53.297 1586545 INFO nova.virt.libvirt.driver [req-45be54a6-d166-44af-b753-e5e3b41699b6 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Successfully detached device vdb from instance dd0deef9-d340-40d3-a0c7-fc527bbc0360 from the persistent domain config.
2023-05-15 22:10:53.414 1586545 INFO nova.virt.libvirt.driver [req-45be54a6-d166-44af-b753-e5e3b41699b6 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Successfully detached device vdb from instance dd0deef9-d340-40d3-a0c7-fc527bbc0360 from the live domain config.
2023-05-15 22:10:53.643 1586545 ERROR nova.volume.cinder [req-45be54a6-d166-44af-b753-e5e3b41699b6 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Delete attachment failed for attachment 5d089c1b-0126-4893-a0f1-4010d2a34a5d. Error: ConflictNovaUsingAttachment: Detach volume from instance dd0deef9-d340-40d3-a0c7-fc527bbc0360 using the Compute API (HTTP 409) (Request-ID: req-16a7d49b-af9d-424e-bc68-7e10547ec45e) Code: 409: cinderclient.exceptions.ClientException: ConflictNovaUsingAttachment: Detach volume from instance dd0deef9-d340-40d3-a0c7-fc527bbc0360 using the Compute API (HTTP 409) (Request-ID: req-16a7d49b-af9d-424e-bc68-7e10547ec45e)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server [req-45be54a6-d166-44af-b753-e5e3b41699b6 fc86ae53ec26420982e90011f910dd68 c3b5e24518d84f1394a3d3f82923092d - b8c9dcb779504cc18eeab1260268f2e7 b8c9dcb779504cc18eeab1260268f2e7] Exception during message handling: cinderclient.exceptions.ClientException: ConflictNovaUsingAttachment: Detach volume from instance dd0deef9-d340-40d3-a0c7-fc527bbc0360 using the Compute API (HTTP 409) (Request-ID: req-16a7d49b-af9d-424e-bc68-7e10547ec45e)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/exception_wrapper.py", line 71, in wrapped
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification(
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server self.force_reraise()
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server raise self.value
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/exception_wrapper.py", line 63, in wrapped
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/utils.py", line 1439, in decorated_function
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/manager.py", line 212, in decorated_function
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context,
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server self.force_reraise()
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server raise self.value
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/manager.py", line 201, in decorated_function
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/manager.py", line 7311, in detach_volume
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server do_detach_volume(context, volume_id, instance, attachment_id)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py", line 391, in inner
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server return f(*args, **kwargs)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/manager.py", line 7308, in do_detach_volume
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server self._detach_volume(context, bdm, instance,
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/compute/manager.py", line 7259, in _detach_volume
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server driver_bdm.detach(context, instance, self.volume_api, self.driver,
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/virt/block_device.py", line 473, in detach
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server self._do_detach(context, instance, volume_api, virt_driver,
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/virt/block_device.py", line 454, in _do_detach
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server volume_api.attachment_delete(context, self['attachment_id'])
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/volume/cinder.py", line 395, in wrapper
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server res = method(self, ctx, *args, **kwargs)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/volume/cinder.py", line 449, in wrapper
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server res = method(self, ctx, attachment_id, *args, **kwargs)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/retrying.py", line 49, in wrapped_f
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server return Retrying(*dargs, **dkw).call(f, *args, **kw)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/retrying.py", line 206, in call
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server return attempt.get(self._wrap_exception)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/retrying.py", line 247, in get
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server six.reraise(self.value[0], self.value[1], self.value[2])
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/six.py", line 703, in reraise
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server raise value
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/retrying.py", line 200, in call
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/volume/cinder.py", line 903, in attachment_delete
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server LOG.error('Delete attachment failed for attachment '
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server self.force_reraise()
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server raise self.value
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/nova/volume/cinder.py", line 894, in attachment_delete
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server cinderclient(
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/cinderclient/api_versions.py", line 421, in substitution
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server return method.func(obj, *args, **kwargs)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/cinderclient/v3/attachments.py", line 45, in delete
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server return self._delete("/attachments/%s" % base.getid(attachment))
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/cinderclient/base.py", line 313, in _delete
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server resp, body = self.api.client.delete(url)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/cinderclient/client.py", line 233, in delete
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server return self._cs_request(url, 'DELETE', **kwargs)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/cinderclient/client.py", line 215, in _cs_request
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server return self.request(url, method, **kwargs)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server File "/usr/lib/python3/dist-packages/cinderclient/client.py", line 201, in request
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server raise exceptions.from_response(resp, body)
2023-05-15 22:10:53.717 1586545 ERROR oslo_messaging.rpc.server cinderclient.exceptions.ClientException: ConflictNovaUsingAttachment: Detach volume from instance dd0deef9-d340-40d3-a0c7-fc527bbc0360 using the Compute API (HTTP 409) (Request-ID: req-16a7d49b-af9d-424e-bc68-7e10547ec45e)

CVE References

CVE-2023-2088
affects: charm-nova-cloud-controller → charm-nova-compute
Jeffrey Chang (modern911) wrote:

We have bumped into this bug 13 times since May 13; please see https://solutions.qa.canonical.com/v2/bugs/2019888 for trends.

Billy Olsen (billy-olsen) wrote:

I believe this is related to the fallout of CVE-2023-2088. The bug fix is currently being reverted in https://bugs.launchpad.net/ubuntu/+source/nova/+bug/2020111.

Ing (turnoingenieria) wrote:

I have the same problem. In my case I deployed OpenStack this week (all software and firmware updated) and then installed Masakari Host Monitor. When I try to evacuate a TEST instance (by powering off the KVM host where the TEST instance runs), nova-compute does not recreate the TEST instance on the other server. On that server, nova-compute logs the error "cinderclient.exceptions.ClientException: ConflictNovaUsingAttachment: Detach volume from instance d4bf9be8-0183-4697-aedb-f653f9a6a15b using the Compute API (HTTP 409)". I use shared storage over FC to provide storage to the KVM nodes, and live migration works without problems. Any idea about this issue?

Boldbayar Jantsan (boldbayar) wrote:

I have the same problem. Has anyone fixed this error?

Francois Scheurer (scheuref) wrote:

Same issue here on cinder 21.3.0, even after configuring the service token ([service_user]) in both directions, nova<->cinder (cf. https://docs.openstack.org/cinder/latest/configuration/block-storage/service-token.html).

Both of the following commands leave the volume stuck in the detaching status:
    openstack server remove volume <server> <volume>
    nova volume-detach <server> <volume>
and this does not help either:
    openstack --os-volume-api-version 3.27 volume attachment delete <vol-attachment>
ConflictNovaUsingAttachment: Detach volume from instance f16e324b-0af0-4218-8918-f8b3917fa44e using the Compute API (HTTP 409) (Request-ID: req-f7ff5add-b4ad-4bf4-a2c9-c20c134ef2f1)

It looks like all 3 requests arrive in cinder as plain user requests, without the service token?
If this is by design, does it mean a volume can no longer be detached from a running VM at all?
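
For reference, the configuration from that document boils down to roughly the following (values are placeholders, and the exact option set may differ per release):

    # nova.conf, on API and compute nodes
    [service_user]
    send_service_user_token = true
    auth_type = password
    auth_url = http://keystone:5000/v3
    username = nova
    password = <nova service password>
    project_name = services
    user_domain_name = Default
    project_domain_name = Default

    # cinder.conf
    [keystone_authtoken]
    service_token_roles = service
    service_token_roles_required = true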

Francois Scheurer (scheuref) wrote:

Sorry, ignore my last comment; I should have noticed that this bug is a duplicate...

Vladimir Grujic (hyperbaba) wrote:

When are those patches going to land in the stable channels? I have deployed a Charmed OpenStack yoga/stable platform which is still affected by these regressions.
