[Bug 1770786] [NEW] Server delete times out due to "Cannot destroy instance, operation time out: libvirtError: Timed out during operation: cannot acquire state change lock (held by remoteDispatchDomainBlockJobAbort)"
Public bug reported:
Seen in a CI run here:
http://logs.openstack.org/82/567682/1/check/nova-next/3920a7e/logs/screen-n-cpu.txt.gz?level=TRACE#_May_11_14_56_54_842884
May 11 14:56:54.842884 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: WARNING nova.virt.libvirt.driver [None req-1b69a74a-b7c5-4c29-af3d-c86a55ef7f4f service nova] [instance: 65a88c43-9287-408d-b9aa-861454939959] Cannot destroy instance, operation time out: libvirtError: Timed out during operation: cannot acquire state change lock (held by remoteDispatchDomainBlockJobAbort)
May 11 14:56:54.876958 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [None req-1b69a74a-b7c5-4c29-af3d-c86a55ef7f4f service nova] [instance: 65a88c43-9287-408d-b9aa-861454939959] Setting instance vm_state to ERROR: InstancePowerOffFailure: Failed to power off instance: operation time out
May 11 14:56:54.877131 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] Traceback (most recent call last):
May 11 14:56:54.877336 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/opt/stack/new/nova/nova/compute/manager.py", line 2528, in do_terminate_instance
May 11 14:56:54.877482 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] self._delete_instance(context, instance, bdms)
May 11 14:56:54.877622 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/opt/stack/new/nova/nova/hooks.py", line 154, in inner
May 11 14:56:54.877761 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] rv = f(*args, **kwargs)
May 11 14:56:54.877910 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/opt/stack/new/nova/nova/compute/manager.py", line 2471, in _delete_instance
May 11 14:56:54.878050 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] self._shutdown_instance(context, instance, bdms)
May 11 14:56:54.878216 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/opt/stack/new/nova/nova/compute/manager.py", line 2356, in _shutdown_instance
May 11 14:56:54.878365 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] pass
May 11 14:56:54.878505 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
May 11 14:56:54.878642 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] self.force_reraise()
May 11 14:56:54.878792 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
May 11 14:56:54.878936 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] six.reraise(self.type_, self.value, self.tb)
May 11 14:56:54.879100 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/opt/stack/new/nova/nova/compute/manager.py", line 2350, in _shutdown_instance
May 11 14:56:54.880077 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] block_device_info)
May 11 14:56:54.880297 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/opt/stack/new/nova/nova/virt/libvirt/driver.py", line 930, in destroy
May 11 14:56:54.880488 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] self._destroy(instance)
May 11 14:56:54.880688 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] File "/opt/stack/new/nova/nova/virt/libvirt/driver.py", line 850, in _destroy
May 11 14:56:54.880878 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] raise exception.InstancePowerOffFailure(reason=reason)
May 11 14:56:54.881097 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959] InstancePowerOffFailure: Failed to power off instance: operation time out
May 11 14:56:54.881322 ubuntu-xenial-ovh-bhs1-0003978700 nova-compute[32419]: ERROR nova.compute.manager [instance: 65a88c43-9287-408d-b9aa-861454939959]
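For context, the WARNING followed by InstancePowerOffFailure in the trace comes from the destroy path in nova's libvirt driver, which maps libvirt's "timed out during operation" error onto a power-off failure. A simplified sketch of that mapping using the libvirt Python bindings (illustrative only, not the verbatim driver.py code):

# Illustrative sketch of the error translation in the libvirt driver's
# destroy path; simplified stand-in, not the exact nova code.
import libvirt


class InstancePowerOffFailure(Exception):
    """Stand-in for nova.exception.InstancePowerOffFailure."""


def hard_destroy(domain):
    """Destroy a libvirt domain, turning a libvirt operation timeout
    into a power-off failure like the one in the trace above."""
    try:
        domain.destroy()
    except libvirt.libvirtError as e:
        if e.get_error_code() == libvirt.VIR_ERR_OPERATION_TIMEOUT:
            # libvirt could not acquire the domain's state change lock in
            # time (here it was still held by
            # remoteDispatchDomainBlockJobAbort), so the driver gives up
            # and the compute manager puts the instance into ERROR.
            raise InstancePowerOffFailure(
                "Failed to power off instance: operation time out")
        raise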
This is only showing up in Rocky, so it is probably due to using the newer
libvirt version from the Queens UCA, which is 4.0.0:

ii  libvirt-bin  4.0.0-1ubuntu7~cloud0  amd64  programs for the libvirt library
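A quick standalone way to confirm which libvirt version the compute node's daemon is actually running (illustrative triage snippet, not part of nova; the qemu:///system URI is an assumption for a typical devstack node):

import libvirt

# Ask the libvirt daemon for its library version; e.g. 4000000 => 4.0.0.
conn = libvirt.openReadOnly('qemu:///system')
ver = conn.getLibVersion()
major, rest = divmod(ver, 1000000)
minor, release = divmod(rest, 1000)
print('libvirt %d.%d.%d' % (major, minor, release))
conn.close()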
http://logstash.openstack.org/#dashboard/file/logstash.json?query=message%3A%5C%22Cannot%20destroy%20instance%2C%20operation%20time%20out%3A%20libvirtError%3A%20Timed%20out%20during%20operation%3A%20cannot%20acquire%20state%20change%20lock%20(held%20by%20remoteDispatchDomainBlockJobAbort)%5C%22%20AND%20tags%3A%5C%22screen-n-cpu.txt%5C%22&from=7d
6 hits in the last 7 days, in both check and gate queues, all failures.
** Affects: nova
Importance: Medium
Status: Confirmed
** Tags: libvirt
--
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/1770786
To manage notifications about this bug go to:
https://bugs.launchpad.net/nova/+bug/1770786/+subscriptions