
yahoo-eng-team team mailing list archive

[Bug 1887946] Fix included in openstack/nova rocky-eol

 

This issue was fixed in the openstack/nova rocky-eol release.

** Changed in: nova/rocky
       Status: Fix Committed => Fix Released

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/1887946

Title:
  Unable to detach volume from instance when previously removed from the
  inactive config

Status in OpenStack Compute (nova):
  Fix Released
Status in OpenStack Compute (nova) queens series:
  Fix Released
Status in OpenStack Compute (nova) rocky series:
  Fix Released
Status in OpenStack Compute (nova) stein series:
  Fix Released
Status in OpenStack Compute (nova) train series:
  Fix Released
Status in OpenStack Compute (nova) ussuri series:
  Fix Released

Bug description:
  Description
  ===========
  The failure described in the title can often be encountered when a previous attempt to detach the volume has failed because the device was still in use within the guest OS.

  This initial attempt removes the device from the inactive config but
  fails to remove it from the live config. Any subsequent attempt then
  fails, because Nova again tries to remove the device from both the
  inactive and live configs even though it is no longer present in the
  inactive config.

  Prior to libvirt v4.1.0 this raised either a VIR_ERR_INVALID_ARG or a
  VIR_ERR_OPERATION_FAILED error code, which n-cpu would handle by
  retrying the detach against the live config.

  Since libvirt v4.1.0, however, this raises a VIR_ERR_DEVICE_MISSING
  error code instead. Nova does not handle this code, so no attempt is
  made to detach the device from the live config.
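
  What follows is a minimal sketch, not the actual Nova fix, of how the
  new error code could be handled alongside the older ones by falling
  back to a live-only detach (the helper name _detach_with_live_fallback
  is made up for illustration):

  import libvirt

  # Error codes libvirt raises when the device is absent from the
  # persistent config; VIR_ERR_DEVICE_MISSING appeared in libvirt v4.1.0.
  DEVICE_NOT_FOUND_CODES = (
      libvirt.VIR_ERR_INVALID_ARG,       # < v4.1.0
      libvirt.VIR_ERR_OPERATION_FAILED,  # < v4.1.0
      libvirt.VIR_ERR_DEVICE_MISSING,    # >= v4.1.0
  )

  def _detach_with_live_fallback(dom, device_xml):
      """Detach from both configs, retrying live-only on 'device missing'."""
      both = libvirt.VIR_DOMAIN_AFFECT_CONFIG | libvirt.VIR_DOMAIN_AFFECT_LIVE
      try:
          dom.detachDeviceFlags(device_xml, flags=both)
      except libvirt.libvirtError as exc:
          if exc.get_error_code() not in DEVICE_NOT_FOUND_CODES:
              raise
          # The device is already gone from the inactive config but may
          # still be attached to the running guest, so retry against the
          # live config only.
          dom.detachDeviceFlags(device_xml,
                                flags=libvirt.VIR_DOMAIN_AFFECT_LIVE)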

  Steps to reproduce
  ==================

  # Start with a volume attached as vdb (ignore the source ;))

  $ sudo virsh domblklist 4b1a0828-8dcc-4b73-a05e-5b50cb62c8f8
   Target   Source
  ------------------------------------------------------------------------------------
   vda      /opt/stack/data/nova/instances/4b1a0828-8dcc-4b73-a05e-5b50cb62c8f8/disk
   vdb      iqn.2010-10.org.openstack:volume-37cc97fa-9776-4b31-8f3f-cb1f18ff1db6/0

  # Detach from the inactive config

  $ sudo virsh detach-disk --config 4b1a0828-8dcc-4b73-a05e-5b50cb62c8f8 vdb
  Disk detached successfully

  # Confirm the device is still listed on the live config

  $ sudo virsh domblklist 4b1a0828-8dcc-4b73-a05e-5b50cb62c8f8
   Target   Source
  ------------------------------------------------------------------------------------
   vda      /opt/stack/data/nova/instances/4b1a0828-8dcc-4b73-a05e-5b50cb62c8f8/disk
   vdb      iqn.2010-10.org.openstack:volume-37cc97fa-9776-4b31-8f3f-cb1f18ff1db6/0

  # and removed from the persistent config

  $ sudo virsh domblklist --inactive 4b1a0828-8dcc-4b73-a05e-5b50cb62c8f8
   Target   Source
  ------------------------------------------------------------------------------------
   vda      /opt/stack/data/nova/instances/4b1a0828-8dcc-4b73-a05e-5b50cb62c8f8/disk

  # Attempt to detach the volume

  $ openstack server remove volume 4b1a0828-8dcc-4b73-a05e-5b50cb62c8f8 test
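
  For completeness, the same failure can be reproduced directly with the
  libvirt Python bindings. This sketch assumes a local qemu:///system
  connection and the instance from the virsh output above; the <disk>
  XML is deliberately minimal, whereas the XML Nova submits also carries
  the source and driver elements:

  import libvirt

  conn = libvirt.open('qemu:///system')
  dom = conn.lookupByUUIDString('4b1a0828-8dcc-4b73-a05e-5b50cb62c8f8')

  # Minimal <disk> element; libvirt matches the device to detach by its
  # target dev.
  disk_xml = ("<disk type='block' device='disk'>"
              "<target dev='vdb' bus='virtio'/></disk>")

  flags = libvirt.VIR_DOMAIN_AFFECT_CONFIG | libvirt.VIR_DOMAIN_AFFECT_LIVE
  try:
      dom.detachDeviceFlags(disk_xml, flags=flags)
  except libvirt.libvirtError as exc:
      # With libvirt >= v4.1.0 this reports VIR_ERR_DEVICE_MISSING (vdb is
      # no longer present in the persistent config).
      print(exc.get_error_code(), exc.get_error_message())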

  Expected result
  ===============
  The initial attempt to detach the device from both configs fails, as the device is no longer present in the inactive config, but Nova continues on and ensures the device is removed from the live config.

  Actual result
  =============
  n-cpu doesn't handle the initial failure as the raised libvirt error code isn't recognised.

  Environment
  ===========
  1. Exact version of OpenStack you are running. See the following
    list for all releases: http://docs.openstack.org/releases/

     b7161fe9b92f0045e97c300a80e58d32b6f49be1

  2. Which hypervisor did you use?
     (For example: Libvirt + KVM, Libvirt + XEN, Hyper-V, PowerKVM, ...)
     What's the version of that?

     libvirt + KVM

  3. Which storage type did you use?
     (For example: Ceph, LVM, GPFS, ...)
     What's the version of that?

     N/A

  4. Which networking type did you use?
     (For example: nova-network, Neutron with OpenVSwitch, ...)

     N/A

  Logs & Configs
  ==============

  $ openstack server remove volume 4b1a0828-8dcc-4b73-a05e-5b50cb62c8f8 test ; journalctl -u devstack@n-cpu -f
  [..]
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: DEBUG oslo_concurrency.lockutils [None req-16d62ef9-d492-4012-bb6d-37e5611ede50 admin admin] Lock "4b1a0828-8dcc-4b73-a05e-5b50cb62c8f8" released by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.141s {{(pid=190210) inner /usr/local/lib/python3.7/site-packages/oslo_concurrency/lockutils.py:371}}
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server [None req-16d62ef9-d492-4012-bb6d-37e5611ede50 admin admin] Exception during message handling: libvirt.libvirtError: device not found: no target device vdb
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 273, in dispatch
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 193, in _do_dispatch
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 78, in wrapped
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     function_name, call_dict, binary)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     self.force_reraise()
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, self.tb)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/six.py", line 703, in reraise
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     raise value
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 69, in wrapped
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/utils.py", line 1440, in decorated_function
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 216, in decorated_function
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     kwargs['instance'], e, sys.exc_info())
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     self.force_reraise()
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, self.tb)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/six.py", line 703, in reraise
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     raise value
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 204, in decorated_function
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 7099, in detach_volume
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     do_detach_volume(context, volume_id, instance, attachment_id)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/oslo_concurrency/lockutils.py", line 360, in inner
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 7097, in do_detach_volume
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     attachment_id=attachment_id)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 7048, in _detach_volume
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     attachment_id=attachment_id, destroy_bdm=destroy_bdm)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/virt/block_device.py", line 477, in detach
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     attachment_id, destroy_bdm)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/virt/block_device.py", line 408, in _do_detach
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     self.driver_detach(context, instance, volume_api, virt_driver)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/virt/block_device.py", line 347, in driver_detach
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     volume_api.roll_detaching(context, volume_id)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     self.force_reraise()
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, self.tb)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/six.py", line 703, in reraise
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     raise value
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/virt/block_device.py", line 329, in driver_detach
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     encryption=encryption)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 2019, in detach_volume
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     live=live)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/virt/libvirt/guest.py", line 425, in detach_device_with_retry
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     _try_detach_device(conf, persistent, live)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/virt/libvirt/guest.py", line 414, in _try_detach_device
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     ctx.reraise = True
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     self.force_reraise()
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, self.tb)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/six.py", line 703, in reraise
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     raise value
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/virt/libvirt/guest.py", line 387, in _try_detach_device
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     self.detach_device(conf, persistent=persistent, live=live)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/virt/libvirt/guest.py", line 475, in detach_device
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     self._domain.detachDeviceFlags(device_xml, flags=flags)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/eventlet/tpool.py", line 190, in doit
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     result = proxy_call(self._autowrap, f, *args, **kwargs)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/eventlet/tpool.py", line 148, in proxy_call
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     rv = execute(f, *args, **kwargs)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/eventlet/tpool.py", line 129, in execute
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     six.reraise(c, e, tb)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/six.py", line 703, in reraise
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     raise value
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.7/site-packages/eventlet/tpool.py", line 83, in tworker
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     rv = meth(*args, **kwargs)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib64/python3.7/site-packages/libvirt.py", line 1309, in detachDeviceFlags
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server     if ret == -1: raise libvirtError ('virDomainDetachDeviceFlags() failed', dom=self)
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server libvirt.libvirtError: device not found: no target device vdb
  Jul 16 17:26:53 localhost.localdomain nova-compute[190210]: ERROR oslo_messaging.rpc.server

To manage notifications about this bug go to:
https://bugs.launchpad.net/nova/+bug/1887946/+subscriptions


