Server delete times out due to "Cannot destroy instance, operation time out: libvirtError: Timed out during operation: cannot acquire state change lock (held by remoteDispatchDomainBlockJobAbort)"

Bug #1770786 reported by Matt Riedemann
This bug affects 5 people

Affects: OpenStack Compute (nova)
Status: Confirmed
Importance: Medium
Assigned to: Unassigned

Bug Description

Seen in a CI run here:

http://logs.openstack.org/82/567682/1/check/nova-next/3920a7e/logs/screen-n-cpu.txt.gz?level=TRACE#_May_11_14_56_54_842884

May 11 14:56:54.842884 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: WARNING nova.virt.libvirt.driver [None req-1b69a74a-b7c5-4c29-af3d-c86a55ef7f4f service nova] [instance: 65a88c43-9287-408d-b9aa-861454939959] Cannot destroy instance, operation time out: libvirtError: Timed out during operation: cannot acquire state change lock (held by remoteDispatchDomainBlockJobAbort)
May 11 14:56:54.876958 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [None req-1b69a74a-b7c5-4c29-af3d-c86a55ef7f4f service nova] [instance: 65a88c43-9287-408d-b9aa-861454939959] Setting instance vm_state to ERROR: InstancePowerOffFailure: Failed to power off instance: operation time out
May 11 14:56:54.877131 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] Traceback (most recent call last):
May 11 14:56:54.877336 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/opt/stack/new/nova/nova/compute/manager.py", line 2528, in do_terminate_instance
May 11 14:56:54.877482 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] self._delete_instance(context, instance, bdms)
May 11 14:56:54.877622 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/opt/stack/new/nova/nova/hooks.py", line 154, in inner
May 11 14:56:54.877761 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] rv = f(*args, **kwargs)
May 11 14:56:54.877910 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/opt/stack/new/nova/nova/compute/manager.py", line 2471, in _delete_instance
May 11 14:56:54.878050 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] self._shutdown_instance(context, instance, bdms)
May 11 14:56:54.878216 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/opt/stack/new/nova/nova/compute/manager.py", line 2356, in _shutdown_instance
May 11 14:56:54.878365 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] pass
May 11 14:56:54.878505 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
May 11 14:56:54.878642 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] self.force_reraise()
May 11 14:56:54.878792 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
May 11 14:56:54.878936 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] six.reraise(self.type_, self.value, self.tb)
May 11 14:56:54.879100 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/opt/stack/new/nova/nova/compute/manager.py", line 2350, in _shutdown_instance
May 11 14:56:54.880077 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] block_device_info)
May 11 14:56:54.880297 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/opt/stack/new/nova/nova/virt/libvirt/driver.py", line 930, in destroy
May 11 14:56:54.880488 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] self._destroy(instance)
May 11 14:56:54.880688 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/opt/stack/new/nova/nova/virt/libvirt/driver.py", line 850, in _destroy
May 11 14:56:54.880878 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] raise exception.InstancePowerOffFailure(reason=reason)
May 11 14:56:54.881097 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] InstancePowerOffFailure: Failed to power off instance: operation time out
May 11 14:56:54.881322 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959]
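
For anyone poking at this outside of Nova, here is a minimal standalone sketch (not Nova code) of the call that times out, assuming libvirt-python, a reachable qemu:///system hypervisor, and the domain UUID copied from the log above. While another RPC such as remoteDispatchDomainBlockJobAbort holds the domain's state change lock, destroy() blocks and libvirt eventually raises VIR_ERR_OPERATION_TIMEOUT, which the driver then surfaces as InstancePowerOffFailure (the traceback above).

    # Minimal sketch, assuming libvirt-python and a reachable qemu:///system.
    import libvirt

    DOMAIN_UUID = '65a88c43-9287-408d-b9aa-861454939959'  # from the log above

    conn = libvirt.open('qemu:///system')
    dom = conn.lookupByUUIDString(DOMAIN_UUID)
    try:
        # Blocks while another job holds the domain's state change lock and
        # eventually fails with VIR_ERR_OPERATION_TIMEOUT instead of returning.
        dom.destroy()
    except libvirt.libvirtError as e:
        if e.get_error_code() == libvirt.VIR_ERR_OPERATION_TIMEOUT:
            # This is the condition the libvirt driver turns into
            # InstancePowerOffFailure in its destroy path.
            print('destroy timed out on the state change lock: %s' % e)
        else:
            raise
    finally:
        conn.close()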

This is only showing up in Rocky, so it is probably due to the newer libvirt version from the Queens UCA, which is 4.0.0:

ii libvirt-bin 4.0.0-1ubuntu7~cloud0 amd64 programs for the libvirt library
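
As a quick way to double check which libvirt a compute node is actually running (beyond the package listing above), something like the following works against a local connection; just a convenience sketch, assuming libvirt-python and a reachable qemu:///system socket:

    # Print the libvirt version the daemon is using, e.g. 4.0.0.
    import libvirt

    conn = libvirt.open('qemu:///system')
    ver = conn.getLibVersion()  # encoded as major * 1000000 + minor * 1000 + release
    print('libvirt %d.%d.%d' % (ver // 1000000, (ver // 1000) % 1000, ver % 1000))
    conn.close()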

http://logstash.openstack.org/#dashboard/file/logstash.json?query=message%3A%5C%22Cannot%20destroy%20instance%2C%20operation%20time%20out%3A%20libvirtError%3A%20Timed%20out%20during%20operation%3A%20cannot%20acquire%20state%20change%20lock%20(held%20by%20remoteDispatchDomainBlockJobAbort)%5C%22%20AND%20tags%3A%5C%22screen-n-cpu.txt%5C%22&from=7d

6 hits in 7 days, check and gate, all failures.

Tags: libvirt
Matt Riedemann (mriedem) wrote:

Also seen in a Train grenade job here:

http://logs.openstack.org/07/668707/1/gate/nova-grenade-live-migration/a07496c/logs/screen-n-cpu.txt.gz?level=TRACE#_Jul_08_17_55_21_915336

Jul 08 17:55:21.915336 ubuntu-bionic-rax-iad-0008752810 nova-compute[1203]: WARNING nova.virt.libvirt.driver [None req-7e57acd7-a044-4acd-96dc-ce5f91e32afd None None] [instance: 1d85e89c-7a1b-4b85-b831-77277f70c75a] Error monitoring migration: Timed out during operation: cannot acquire state change lock (held by remoteDispatchDomainMigratePerform3Params): libvirtError: Timed out during operation: cannot acquire state change lock (held by remoteDispatchDomainMigratePerform3Params)
Jul 08 17:55:21.915336 ubuntu-bionic-rax-iad-0008752810 nova-compute[1203]: ERROR nova.virt.libvirt.driver [instance: 1d85e89c-7a1b-4b85-b831-77277f70c75a] Traceback (most recent call last):
Jul 08 17:55:21.915336 ubuntu-bionic-rax-iad-0008752810 nova-compute[1203]: ERROR nova.virt.libvirt.driver [instance: 1d85e89c-7a1b-4b85-b831-77277f70c75a] File "/opt/stack/new/nova/nova/virt/libvirt/driver.py", line 8165, in _live_migration
Jul 08 17:55:21.915336 ubuntu-bionic-rax-iad-0008752810 nova-compute[1203]: ERROR nova.virt.libvirt.driver [instance: 1d85e89c-7a1b-4b85-b831-77277f70c75a] finish_event, disk_paths)
Jul 08 17:55:21.915336 ubuntu-bionic-rax-iad-0008752810 nova-compute[1203]: ERROR nova.virt.libvirt.driver [instance: 1d85e89c-7a1b-4b85-b831-77277f70c75a] File "/opt/stack/new/nova/nova/virt/libvirt/driver.py", line 7955, in _live_migration_monitor
Jul 08 17:55:21.915336 ubuntu-bionic-rax-iad-0008752810 nova-compute[1203]: ERROR nova.virt.libvirt.driver [instance: 1d85e89c-7a1b-4b85-b831-77277f70c75a] info = guest.get_job_info()
Jul 08 17:55:21.915336 ubuntu-bionic-rax-iad-0008752810 nova-compute[1203]: ERROR nova.virt.libvirt.driver [instance: 1d85e89c-7a1b-4b85-b831-77277f70c75a] File "/opt/stack/new/nova/nova/virt/libvirt/guest.py", line 698, in get_job_info
Jul 08 17:55:21.915336 ubuntu-bionic-rax-iad-0008752810 nova-compute[1203]: ERROR nova.virt.libvirt.driver [instance: 1d85e89c-7a1b-4b85-b831-77277f70c75a] stats = self._domain.jobStats()
Jul 08 17:55:21.915336 ubuntu-bionic-rax-iad-0008752810 nova-compute[1203]: ERROR nova.virt.libvirt.driver [instance: 1d85e89c-7a1b-4b85-b831-77277f70c75a] File "/usr/local/lib/python2.7/dist-packages/eventlet/tpool.py", line 190, in doit
Jul 08 17:55:21.915336 ubuntu-bionic-rax-iad-0008752810 nova-compute[1203]: ERROR nova.virt.libvirt.driver [instance: 1d85e89c-7a1b-4b85-b831-77277f70c75a] result = proxy_call(self._autowrap, f, *args, **kwargs)
Jul 08 17:55:21.915336 ubuntu-bionic-rax-iad-0008752810 nova-compute[1203]: ERROR nova.virt.libvirt.driver [instance: 1d85e89c-7a1b-4b85-b831-77277f70c75a] File "/usr/local/lib/python2.7/dist-packages/eventlet/tpool.py", line 148, in proxy_call
Jul 08 17:55:21.915336 ubuntu-bionic-rax-iad-0008752810 nova-compute[1203]: ERROR nova.virt.libvirt.driver [instance: 1d85e89c-7a1b-4b85-b831-77277f70c75a] rv = execute(f, *args, **kwargs)
Jul 08 17:55:21.915336 ubuntu-bionic-rax-iad-0008752810 nova-compute[1203]: ERROR nova.virt.libvirt.driver [instance: 1d85e89c-7a1b-4b85-b831-772...

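The failure mode here is the same, just hit from the monitoring side: _live_migration_monitor polls the domain's job stats, and that call blocks on the state change lock held by the in-flight remoteDispatchDomainMigratePerform3Params job. A standalone sketch of that polling call, assuming libvirt-python, qemu:///system, and the domain UUID copied from the log above:

    # Minimal sketch of the jobStats() call that guest.get_job_info() wraps.
    import libvirt

    DOMAIN_UUID = '1d85e89c-7a1b-4b85-b831-77277f70c75a'  # from the log above

    conn = libvirt.open('qemu:///system')
    dom = conn.lookupByUUIDString(DOMAIN_UUID)
    try:
        # While the migration RPC holds the domain's state change lock this
        # blocks and eventually raises VIR_ERR_OPERATION_TIMEOUT.
        print(dom.jobStats())
    except libvirt.libvirtError as e:
        if e.get_error_code() == libvirt.VIR_ERR_OPERATION_TIMEOUT:
            print('jobStats timed out on the state change lock: %s' % e)
        else:
            raise
    finally:
        conn.close()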

王义军 (armitagewxj) wrote:

I also encountered this error. Running a qemu guest agent (qga) command through virsh hit the same error; it was resolved after restarting libvirtd.
libvirt version: 4.5.0
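
For reference, the virsh qemu-agent-command path also has to acquire a per-domain job, so a guest agent call can hit the identical "cannot acquire state change lock" timeout while another job is stuck, which matches what this comment describes. A minimal sketch of the equivalent call from libvirt-python (the domain name and the guest-ping payload are only examples, and the qemu guest agent must be running inside the guest):

    # Sketch of issuing a guest agent command, roughly what
    # virsh qemu-agent-command <domain> '{"execute": "guest-ping"}' does.
    import libvirt
    import libvirt_qemu

    conn = libvirt.open('qemu:///system')
    dom = conn.lookupByName('instance-00000001')  # placeholder domain name
    try:
        reply = libvirt_qemu.qemuAgentCommand(
            dom, '{"execute": "guest-ping"}', 10, 0)  # wait up to 10 seconds
        print(reply)
    except libvirt.libvirtError as e:
        # The same state change lock timeout can surface here if another
        # job is holding the domain lock.
        print('agent command failed: %s' % e)
    finally:
        conn.close()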
