yahoo-eng-team mailing list archive

[Bug 1329333] Re: BadRequest: Invalid volume: Volume status must be available or error


The Nova libvirt driver is failing to detach the volume.

n-api log:

2014-10-06 21:13:37.560 AUDIT nova.api.openstack.compute.contrib.volumes [req-f87a213f-6677-4288-b91e-25769f55a2f3 TestEncryptedCinderVolumes-1235148374 TestEncryptedCinderVolumes-731143481] Detach volume ec116004-afd7-4131-9ee8-02ab666ec7bd

-----------------------------------------------------------------------------------------------------------------------
c-api log:

Begin detaching:

2014-10-06 21:13:37.864 18627 INFO cinder.api.openstack.wsgi [req-57cf26e1-8cdd-4e20-943c-393aba8286fd 980965010fee4b7f800ef366726b5927 ba5e42d2f06340058633ad1a5a84b1b1 - - -] POST http://127.0.0.1:8776/v1/ba5e42d2f06340058633ad1a5a84b1b1/volumes/ec116004-afd7-4131-9ee8-02ab666ec7bd/action
2014-10-06 21:13:37.865 18627 DEBUG cinder.api.openstack.wsgi [req-57cf26e1-8cdd-4e20-943c-393aba8286fd 980965010fee4b7f800ef366726b5927 ba5e42d2f06340058633ad1a5a84b1b1 - - -] Action body: {"os-begin_detaching": null} get_method /opt/stack/new/cinder/cinder/api/openstack/wsgi.py:1008
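
The os-begin_detaching action is what moves the volume from 'in-use' to 'detaching'. As a rough illustration (not Nova's exact code path; the credentials and endpoint below are placeholders), the same action can be issued with python-cinderclient:

    from cinderclient.v1 import client

    # Placeholder credentials/endpoint, for illustration only.
    cinder = client.Client('user', 'password', 'project',
                           'http://127.0.0.1:5000/v2.0')

    # POSTs {"os-begin_detaching": null} to the volume's /action URL,
    # as seen in the c-api log above.
    cinder.volumes.begin_detaching('ec116004-afd7-4131-9ee8-02ab666ec7bd')
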
-----------------------------------------------------------------------------------------------------------------
Status changed to 'detaching':

2014-10-06 21:13:38.078 18627 AUDIT cinder.api.v1.volumes [req-9b3ab70e-897b-4d27-80d1-89d5678a481f 980965010fee4b7f800ef366726b5927 ba5e42d2f06340058633ad1a5a84b1b1 - - -] vol={'migration_status': None, 'availability_zone': u'nova', 'terminated_at': None, 'updated_at': datetime.datetime(2014, 10, 6, 21, 13, 37), 'provider_geometry': None, 'snapshot_id': None, 'ec2_id': None, 'mountpoint': u'/dev/vdb', 'deleted_at': None, 'id': u'ec116004-afd7-4131-9ee8-02ab666ec7bd', 'size': 1L, 'user_id': u'980965010fee4b7f800ef366726b5927', 'attach_time': u'2014-10-06T21:13:35.855790', 'attached_host': None, 'display_description': None, 'volume_admin_metadata': [<cinder.db.sqlalchemy.models.VolumeAdminMetadata object at 0x4e80e90>, <cinder.db.sqlalchemy.models.VolumeAdminMetadata object at 0x5c1b250>], 'encryption_key_id': u'00000000-0000-0000-0000-000000000000', 'project_id': u'ba5e42d2f06340058633ad1a5a84b1b1', 'launched_at': datetime.datetime(2014, 10, 6, 21, 13, 29), 'scheduled_at': datetime.datetime(2014, 10, 6, 21, 13, 29), 'status': u'detaching', 'volume_type_id': u'03ce3467-70d3-442f-a93b-4bcba3ac662a', 'deleted': False, 'provider_location': 
----------------------------------------------------------------------------------------------------------------------
n-cpu log: error from the libvirt driver while detaching the volume:

2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher Traceback (most recent call last):
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/usr/local/lib/python2.7/dist-packages/oslo/messaging/rpc/dispatcher.py", line 133, in _dispatch_and_reply
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     incoming.message))
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/usr/local/lib/python2.7/dist-packages/oslo/messaging/rpc/dispatcher.py", line 176, in _dispatch
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     return self._do_dispatch(endpoint, method, ctxt, args)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/usr/local/lib/python2.7/dist-packages/oslo/messaging/rpc/dispatcher.py", line 122, in _do_dispatch
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     result = getattr(endpoint, method)(ctxt, **new_args)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/exception.py", line 88, in wrapped
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     payload)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/openstack/common/excutils.py", line 68, in __exit__
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     six.reraise(self.type_, self.value, self.tb)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/exception.py", line 71, in wrapped
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     return f(self, context, *args, **kw)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/compute/manager.py", line 274, in decorated_function
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     pass
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/openstack/common/excutils.py", line 68, in __exit__
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     six.reraise(self.type_, self.value, self.tb)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/compute/manager.py", line 260, in decorated_function
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     return function(self, context, *args, **kwargs)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/compute/manager.py", line 303, in decorated_function
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     e, sys.exc_info())
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/openstack/common/excutils.py", line 68, in __exit__
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     six.reraise(self.type_, self.value, self.tb)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/compute/manager.py", line 290, in decorated_function
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     return function(self, context, *args, **kwargs)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/compute/manager.py", line 4259, in detach_volume
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     self._detach_volume(context, instance, bdm)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/compute/manager.py", line 4229, in _detach_volume
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     self.volume_api.roll_detaching(context, volume_id)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/openstack/common/excutils.py", line 68, in __exit__
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     six.reraise(self.type_, self.value, self.tb)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/compute/manager.py", line 4222, in _detach_volume
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     encryption=encryption)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/virt/libvirt/driver.py", line 1392, in detach_volume
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     encryptor.detach_volume(**encryption)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/volume/encryptors/cryptsetup.py", line 101, in detach_volume
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     self._close_volume(**kwargs)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/volume/encryptors/cryptsetup.py", line 97, in _close_volume
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     run_as_root=True, check_exit_code=True)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/utils.py", line 165, in execute
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     return processutils.execute(*cmd, **kwargs)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/nova/nova/openstack/common/processutils.py", line 195, in execute
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher     cmd=sanitized_cmd)
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher ProcessExecutionError: Unexpected error while running command.
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher Command: sudo nova-rootwrap /etc/nova/rootwrap.conf cryptsetup remove ip-127.0.0.1:3260-iscsi-iqn.2010-10.org.openstack:volume-ec116004-afd7-4131-9ee8-02ab666ec7bd-lun-1
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher Exit code: 5
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher Stdout: u''
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher Stderr: u'Device ip-127.0.0.1:3260-iscsi-iqn.2010-10.org.openstack:volume-ec116004-afd7-4131-9ee8-02ab666ec7bd-lun-1 is busy.\n'
2014-10-06 21:13:39.391 18441 TRACE oslo.messaging.rpc.dispatcher 
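
The failing command is the dm-crypt teardown in nova/volume/encryptors/cryptsetup.py. cryptsetup exit code 5 means the device-mapper node is still open ("Device ... is busy"), i.e. something (the guest, or the iSCSI/encryptor layers) still holds the mapping when Nova tries to remove it. A simplified sketch of the call site, reconstructed from the traceback (not a verbatim copy of the file):

    # nova/volume/encryptors/cryptsetup.py (simplified from the traceback)
    def _close_volume(self, **kwargs):
        # Remove the dm-crypt mapping for the volume; cryptsetup exits
        # with code 5 ("Device ... is busy") if the mapping is still open.
        self._execute('cryptsetup', 'remove', self.dev_name,
                      run_as_root=True, check_exit_code=True)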

--------------------------------------------------------------------------------------------------------------
Nova rolls the volume back from 'detaching' to 'in-use':

2014-10-06 21:13:39.146 18627 INFO cinder.api.openstack.wsgi [req-072e2662-3f8b-4ca5-82e4-569a990bc34c 980965010fee4b7f800ef366726b5927 ba5e42d2f06340058633ad1a5a84b1b1 - - -] POST http://127.0.0.1:8776/v1/ba5e42d2f06340058633ad1a5a84b1b1/volumes/ec116004-afd7-4131-9ee8-02ab666ec7bd/action
2014-10-06 21:13:39.146 18627 DEBUG cinder.api.openstack.wsgi [req-072e2662-3f8b-4ca5-82e4-569a990bc34c 980965010fee4b7f800ef366726b5927 ba5e42d2f06340058633ad1a5a84b1b1 - - -] Action body: {"os-roll_detaching": null} get_method /opt/stack/new/cinder/cinder/api/openstack/wsgi.py:1008
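
This os-roll_detaching call comes from the error path in nova/compute/manager.py (_detach_volume, visible in the traceback above): when the driver's detach fails, Nova asks Cinder to roll the status back so the volume is not left stuck in 'detaching'. Roughly (a sketch based on the traceback, not the exact source):

    # nova/compute/manager.py, _detach_volume (sketch)
    try:
        self.driver.detach_volume(connection_info, instance, mp,
                                  encryption=encryption)
    except Exception:
        with excutils.save_and_reraise_exception():
            # Put the volume back to 'in-use'; this POSTs
            # {"os-roll_detaching": null}, as seen in the c-api log above.
            self.volume_api.roll_detaching(context, volume_id)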


** Also affects: nova
   Importance: Undecided
       Status: New

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/1329333

Title:
  BadRequest: Invalid volume: Volume status must be available or error

Status in OpenStack Compute (Nova):
  New
Status in Tempest:
  Invalid

Bug description:
  traceback from:
  http://logs.openstack.org/40/99540/2/check/check-grenade-dsvm/85c496c/console.html

  
  2014-06-12 13:28:15.833 | tearDownClass (tempest.scenario.test_volume_boot_pattern.TestVolumeBootPattern)
  2014-06-12 13:28:15.833 | -------------------------------------------------------------------------------
  2014-06-12 13:28:15.833 | 
  2014-06-12 13:28:15.833 | Captured traceback:
  2014-06-12 13:28:15.833 | ~~~~~~~~~~~~~~~~~~~
  2014-06-12 13:28:15.833 |     Traceback (most recent call last):
  2014-06-12 13:28:15.833 |       File "tempest/scenario/manager.py", line 157, in tearDownClass
  2014-06-12 13:28:15.833 |         cls.cleanup_resource(thing, cls.__name__)
  2014-06-12 13:28:15.834 |       File "tempest/scenario/manager.py", line 119, in cleanup_resource
  2014-06-12 13:28:15.834 |         resource.delete()
  2014-06-12 13:28:15.834 |       File "/opt/stack/new/python-cinderclient/cinderclient/v1/volumes.py", line 35, in delete
  2014-06-12 13:28:15.834 |         self.manager.delete(self)
  2014-06-12 13:28:15.834 |       File "/opt/stack/new/python-cinderclient/cinderclient/v1/volumes.py", line 228, in delete
  2014-06-12 13:28:15.834 |         self._delete("/volumes/%s" % base.getid(volume))
  2014-06-12 13:28:15.834 |       File "/opt/stack/new/python-cinderclient/cinderclient/base.py", line 162, in _delete
  2014-06-12 13:28:15.834 |         resp, body = self.api.client.delete(url)
  2014-06-12 13:28:15.834 |       File "/opt/stack/new/python-cinderclient/cinderclient/client.py", line 229, in delete
  2014-06-12 13:28:15.834 |         return self._cs_request(url, 'DELETE', **kwargs)
  2014-06-12 13:28:15.834 |       File "/opt/stack/new/python-cinderclient/cinderclient/client.py", line 187, in _cs_request
  2014-06-12 13:28:15.835 |         **kwargs)
  2014-06-12 13:28:15.835 |       File "/opt/stack/new/python-cinderclient/cinderclient/client.py", line 170, in request
  2014-06-12 13:28:15.835 |         raise exceptions.from_response(resp, body)
  2014-06-12 13:28:15.835 |     BadRequest: Invalid volume: Volume status must be available or error, but current status is: in-use (HTTP 400) (Request-ID: req-9337623a-e2b7-48a3-97ab-f7a4845f2cd8)
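
The delete fails because the detach rollback described above left the volume 'in-use', and Cinder only allows delete when the status is 'available' or 'error'. A minimal sketch (not Tempest's actual cleanup code) of a teardown that waits for a deletable status; note that in this failure the volume stays 'in-use', so the wait would time out rather than mask the bug:

    import time

    def delete_volume_when_ready(cinder, volume_id, timeout=60):
        # Poll until the volume reaches a status Cinder will delete.
        deadline = time.time() + timeout
        while time.time() < deadline:
            volume = cinder.volumes.get(volume_id)
            if volume.status in ('available', 'error'):
                cinder.volumes.delete(volume_id)
                return
            time.sleep(2)
        raise RuntimeError('volume %s stuck in status %s'
                           % (volume_id, volume.status))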

To manage notifications about this bug go to:
https://bugs.launchpad.net/nova/+bug/1329333/+subscriptions