
yahoo-eng-team team mailing list archive

[Bug 1349888] Re: Attempting to attach the same volume multiple times can cause bdm record for existing attachment to be deleted.


** Also affects: nova (Ubuntu)
   Importance: Undecided
       Status: New

** Also affects: nova (Ubuntu Trusty)
   Importance: Undecided
       Status: New

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/1349888

Title:
  Attempting to attach the same volume multiple times can cause bdm
  record for existing attachment to be deleted.

Status in OpenStack Compute (nova):
  Fix Released
Status in nova package in Ubuntu:
  New
Status in nova source package in Trusty:
  New

Bug description:
  nova assumes there is only ever one bdm per volume. When an attach is
  initiated, a new bdm is created; if the attach fails, a bdm for the
  volume is deleted during cleanup, but it is not necessarily the one
  that was just created (a sketch of this cleanup pattern follows the
  traceback at the end of this description). The following steps show
  how a volume can get stuck in "detaching" because of this.

  
  $ nova list
  +--------------------------------------+--------+--------+------------+-------------+------------------+
  | ID                                   | Name   | Status | Task State | Power State | Networks         |
  +--------------------------------------+--------+--------+------------+-------------+------------------+
  | cb5188f8-3fe1-4461-8a9d-3902f7cc8296 | test13 | ACTIVE | -          | Running     | private=10.0.0.2 |
  +--------------------------------------+--------+--------+------------+-------------+------------------+

  $ cinder list
  +--------------------------------------+-----------+--------+------+-------------+----------+-------------+
  |                  ID                  |   Status  |  Name  | Size | Volume Type | Bootable | Attached to |
  +--------------------------------------+-----------+--------+------+-------------+----------+-------------+
  | c1e38e93-d566-4c99-bfc3-42e77a428cc4 | available | test10 |  1   |     lvm1    |  false   |             |
  +--------------------------------------+-----------+--------+------+-------------+----------+-------------+

  $ nova volume-attach test13 c1e38e93-d566-4c99-bfc3-42e77a428cc4
  +----------+--------------------------------------+
  | Property | Value                                |
  +----------+--------------------------------------+
  | device   | /dev/vdb                             |
  | id       | c1e38e93-d566-4c99-bfc3-42e77a428cc4 |
  | serverId | cb5188f8-3fe1-4461-8a9d-3902f7cc8296 |
  | volumeId | c1e38e93-d566-4c99-bfc3-42e77a428cc4 |
  +----------+--------------------------------------+

  $ cinder list
  +--------------------------------------+--------+--------+------+-------------+----------+--------------------------------------+
  |                  ID                  | Status |  Name  | Size | Volume Type | Bootable |             Attached to              |
  +--------------------------------------+--------+--------+------+-------------+----------+--------------------------------------+
  | c1e38e93-d566-4c99-bfc3-42e77a428cc4 | in-use | test10 |  1   |     lvm1    |  false   | cb5188f8-3fe1-4461-8a9d-3902f7cc8296 |
  +--------------------------------------+--------+--------+------+-------------+----------+--------------------------------------+

  $ nova volume-attach test13 c1e38e93-d566-4c99-bfc3-42e77a428cc4
  ERROR (BadRequest): Invalid volume: status must be 'available' (HTTP 400) (Request-ID: req-1fa34b54-25b5-4296-9134-b63321b0015d)

  $ nova volume-detach test13 c1e38e93-d566-4c99-bfc3-42e77a428cc4

  $ cinder list
  +--------------------------------------+-----------+--------+------+-------------+----------+--------------------------------------+
  |                  ID                  |   Status  |  Name  | Size | Volume Type | Bootable |             Attached to              |
  +--------------------------------------+-----------+--------+------+-------------+----------+--------------------------------------+
  | c1e38e93-d566-4c99-bfc3-42e77a428cc4 | detaching | test10 |  1   |     lvm1    |  false   | cb5188f8-3fe1-4461-8a9d-3902f7cc8296 |
  +--------------------------------------+-----------+--------+------+-------------+----------+--------------------------------------+


  
  The detach request then fails in nova-compute because the bdm that
  remains for the volume has no connection_info:

  2014-07-29 14:47:13.952 ERROR oslo.messaging.rpc.dispatcher [req-134dfd17-14da-4de0-93fc-5d8d7bbf65a5 admin admin] Exception during message handling: <type 'NoneType'> can't be decoded
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher Traceback (most recent call last):
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/usr/local/lib/python2.7/dist-packages/oslo/messaging/rpc/dispatcher.py", line 134, in _dispatch_and_reply
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     incoming.message))
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/usr/local/lib/python2.7/dist-packages/oslo/messaging/rpc/dispatcher.py", line 177, in _dispatch
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     return self._do_dispatch(endpoint, method, ctxt, args)
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/usr/local/lib/python2.7/dist-packages/oslo/messaging/rpc/dispatcher.py", line 123, in _do_dispatch
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     result = getattr(endpoint, method)(ctxt, **new_args)
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/nova/nova/compute/manager.py", line 406, in decorated_function
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     return function(self, context, *args, **kwargs)
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/nova/nova/exception.py", line 88, in wrapped
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     payload)
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/nova/nova/openstack/common/excutils.py", line 82, in __exit__
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     six.reraise(self.type_, self.value, self.tb)
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/nova/nova/exception.py", line 71, in wrapped
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     return f(self, context, *args, **kw)
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/nova/nova/compute/manager.py", line 291, in decorated_function
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     pass
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/nova/nova/openstack/common/excutils.py", line 82, in __exit__
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     six.reraise(self.type_, self.value, self.tb)
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/nova/nova/compute/manager.py", line 277, in decorated_function
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     return function(self, context, *args, **kwargs)
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/nova/nova/compute/manager.py", line 319, in decorated_function
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     kwargs['instance'], e, sys.exc_info())
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/nova/nova/openstack/common/excutils.py", line 82, in __exit__
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     six.reraise(self.type_, self.value, self.tb)
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/nova/nova/compute/manager.py", line 307, in decorated_function
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     return function(self, context, *args, **kwargs)
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/nova/nova/compute/manager.py", line 4363, in detach_volume
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     self._detach_volume(context, instance, bdm)
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/nova/nova/compute/manager.py", line 4309, in _detach_volume
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     connection_info = jsonutils.loads(bdm.connection_info)
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/nova/nova/openstack/common/jsonutils.py", line 176, in loads
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     return json.loads(strutils.safe_decode(s, encoding), **kwargs)
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/nova/nova/openstack/common/strutils.py", line 134, in safe_decode
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher     raise TypeError("%s can't be decoded" % type(text))
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher TypeError: <type 'NoneType'> can't be decoded
  2014-07-29 14:47:13.952 31588 TRACE oslo.messaging.rpc.dispatcher
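
  The traceback above is consistent with the description: the bdm that
  survives the failed second attach is the newly created one, whose
  connection_info was never populated. As a rough illustration (plain
  Python, not nova's actual code; the table layout and the helper name
  destroy_by_instance_and_volume are assumptions made for this sketch),
  deleting a bdm matched only by instance and volume can remove the
  record of the existing attachment:

  import json

  VOLUME = 'c1e38e93-d566-4c99-bfc3-42e77a428cc4'
  INSTANCE = 'cb5188f8-3fe1-4461-8a9d-3902f7cc8296'

  # Healthy attachment: its bdm carries the connection_info needed to detach.
  bdm_table = [{'id': 1, 'instance_uuid': INSTANCE, 'volume_id': VOLUME,
                'connection_info': json.dumps({'driver_volume_type': 'iscsi'})}]

  # Second attach attempt: a new bdm is created first, then the attach is
  # rejected because the volume is already in-use.
  bdm_table.append({'id': 2, 'instance_uuid': INSTANCE, 'volume_id': VOLUME,
                    'connection_info': None})

  def destroy_by_instance_and_volume(table, instance_uuid, volume_id):
      # Problematic cleanup pattern: drop *a* bdm matching the volume,
      # which may be the pre-existing attachment, not the new record.
      for row in table:
          if (row['instance_uuid'] == instance_uuid
                  and row['volume_id'] == volume_id):
              table.remove(row)
              return

  destroy_by_instance_and_volume(bdm_table, INSTANCE, VOLUME)

  # The surviving record is the one created for the failed attach, so a
  # later detach finds connection_info of None and raises a TypeError,
  # mirroring the traceback above.
  surviving = bdm_table[0]
  print(surviving['id'])  # prints 2, not 1
  try:
      json.loads(surviving['connection_info'])
  except TypeError as exc:
      print('detach fails: %s' % exc)

  The obvious remedy is for the cleanup to destroy the exact bdm it
  created for the failed attach rather than looking one up by volume;
  the "Fix Released" status above indicates nova addressed this, though
  the exact change is not quoted here.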

To manage notifications about this bug go to:
https://bugs.launchpad.net/nova/+bug/1349888/+subscriptions

