yahoo-eng-team team mailing list archive
Message #65039
[Bug 1609984] Re: volume-detach fails for shelved instance
** Changed in: nova/mitaka
Status: Fix Committed => Fix Released
--
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/1609984
Title:
volume-detach fails for shelved instance
Status in OpenStack Compute (nova):
Fix Released
Status in OpenStack Compute (nova) mitaka series:
Fix Released
Bug description:
nova/compute/api.py::_local_cleanup_bdm_volumes passes a fake
connector to Cinder to ask it to terminate a connection to a volume.
Many Cinder volume drivers need a valid connector that has a real
'host' value in order to terminate the connection on the array.
The connector being passed in is:
'connector': {u'ip': u'127.0.0.1', u'initiator': u'iqn.fake'}
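  For illustration only, a minimal sketch of a connector that would satisfy
  drivers which index connector['host'] directly (the helper name and the use
  of the instance's last known host are assumptions, not necessarily the
  actual fix):

    # Hypothetical sketch only -- not the actual Nova change.
    def _build_cleanup_connector(instance):
        """Build a connector for local cleanup of a shelved instance's BDMs."""
        return {
            'ip': '127.0.0.1',
            'initiator': 'iqn.fake',
            # Drivers such as hpe_3par_iscsi look up connector['host']
            # directly (see the KeyError below), so the key must be present.
            'host': getattr(instance, 'host', None) or 'fake-host',
        }

  The failure seen today with the fake connector: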
  2016-08-04 13:56:41.672 DEBUG cinder.volume.drivers.hpe.hpe_3par_iscsi [req-6a382dfe-d1a5-47e7-99bc-e2a383124cd8 aa5ab308cd5b47eb9b2798ec9e2abb32 4b136f9898994fec81393c3b8210980b] ==> terminate_connection: call {'volume': <cinder.db.sqlalchemy.models.Volume object at 0x7f1f2f130d10>, 'connector': {u'ip': u'127.0.0.1', u'initiator': u'iqn.fake'}, 'self': <cinder.volume.drivers.hpe.hpe_3par_iscsi.HPE3PARISCSIDriver object at 0x7f1ee0858950>, 'kwargs': {'force': False}} from (pid=45144) trace_logging_wrapper /opt/stack/cinder/cinder/utils.py:843
  2016-08-04 13:56:41.705 DEBUG cinder.volume.drivers.hpe.hpe_3par_common [req-6a382dfe-d1a5-47e7-99bc-e2a383124cd8 aa5ab308cd5b47eb9b2798ec9e2abb32 4b136f9898994fec81393c3b8210980b] Connecting to 3PAR from (pid=45144) client_login /opt/stack/cinder/cinder/volume/drivers/hpe/hpe_3par_common.py:350
  2016-08-04 13:56:42.164 DEBUG cinder.volume.drivers.hpe.hpe_3par_common [req-6a382dfe-d1a5-47e7-99bc-e2a383124cd8 aa5ab308cd5b47eb9b2798ec9e2abb32 4b136f9898994fec81393c3b8210980b] Disconnect from 3PAR REST and SSH 1278aedb-8579-4776-8d85-c46ec93a0551 from (pid=45144) client_logout /opt/stack/cinder/cinder/volume/drivers/hpe/hpe_3par_common.py:374
  2016-08-04 13:56:42.187 DEBUG cinder.volume.drivers.hpe.hpe_3par_iscsi [req-6a382dfe-d1a5-47e7-99bc-e2a383124cd8 aa5ab308cd5b47eb9b2798ec9e2abb32 4b136f9898994fec81393c3b8210980b] <== terminate_connection: exception (513ms) KeyError('host',) from (pid=45144) trace_logging_wrapper /opt/stack/cinder/cinder/utils.py:853
2016-08-04 13:56:42.188 ERROR cinder.volume.manager [req-6a382dfe-d1a5-47e7-99bc-e2a383124cd8 aa5ab308cd5b47eb9b2798ec9e2abb32 4b136f9898994fec81393c3b8210980b] Terminate volume connection failed: 'host'
2016-08-04 13:56:42.188 TRACE cinder.volume.manager Traceback (most recent call last):
2016-08-04 13:56:42.188 TRACE cinder.volume.manager File "/opt/stack/cinder/cinder/volume/manager.py", line 1457, in terminate_connection
2016-08-04 13:56:42.188 TRACE cinder.volume.manager force=force)
2016-08-04 13:56:42.188 TRACE cinder.volume.manager File "/opt/stack/cinder/cinder/utils.py", line 847, in trace_logging_wrapper
2016-08-04 13:56:42.188 TRACE cinder.volume.manager result = f(*args, **kwargs)
2016-08-04 13:56:42.188 TRACE cinder.volume.manager File "/opt/stack/cinder/cinder/volume/drivers/hpe/hpe_3par_iscsi.py", line 478, in terminate_connection
2016-08-04 13:56:42.188 TRACE cinder.volume.manager hostname = common._safe_hostname(connector['host'])
2016-08-04 13:56:42.188 TRACE cinder.volume.manager KeyError: 'host'
2016-08-04 13:56:42.188 TRACE cinder.volume.manager
2016-08-04 13:56:42.193 ERROR oslo_messaging.rpc.server [req-6a382dfe-d1a5-47e7-99bc-e2a383124cd8 aa5ab308cd5b47eb9b2798ec9e2abb32 4b136f9898994fec81393c3b8210980b] Exception during message handling
2016-08-04 13:56:42.193 TRACE oslo_messaging.rpc.server Traceback (most recent call last):
2016-08-04 13:56:42.193 TRACE oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 133, in _process_incoming
2016-08-04 13:56:42.193 TRACE oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
2016-08-04 13:56:42.193 TRACE oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 150, in dispatch
2016-08-04 13:56:42.193 TRACE oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
2016-08-04 13:56:42.193 TRACE oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 121, in _do_dispatch
2016-08-04 13:56:42.193 TRACE oslo_messaging.rpc.server result = func(ctxt, **new_args)
2016-08-04 13:56:42.193 TRACE oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/volume/manager.py", line 1462, in terminate_connection
2016-08-04 13:56:42.193 TRACE oslo_messaging.rpc.server raise exception.VolumeBackendAPIException(data=err_msg)
2016-08-04 13:56:42.193 TRACE oslo_messaging.rpc.server VolumeBackendAPIException: Bad or unexpected response from the storage volume backend API: Terminate volume connection failed: 'host'
2016-08-04 13:56:42.193 TRACE oslo_messaging.rpc.server
To manage notifications about this bug go to:
https://bugs.launchpad.net/nova/+bug/1609984/+subscriptions