[Bug 1833504] Re: Nova evacuate failed with Cinder driver backend HP3PAR 8000 ISCSI Multipath
/var/log/cinder/api.log:2019-06-19 15:23:02.316 10647 ERROR
cinder.api.middleware.fault [req-a350dc82-f052-48b5-a834-dcf550fd5089
be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 -
default default] Caught error: <class
'oslo_messaging.exceptions.MessagingTimeout'> Timed out waiting for a
reply to message ID 0f7e5118842547c7ae903bb7aea8d4a9: MessagingTimeout:
Timed out waiting for a reply to message ID
0f7e5118842547c7ae903bb7aea8d4a9
+cinder -nova
You can see from the above snippet that c-api is timing out waiting for
c-vol, and this results in c-api returning 500 to n-api, which ultimately
leaves the instance in an ERROR state. I assume the c-api to c-vol call
is terminate_connection, so we'd need to see the associated c-vol logs
here to find out why this is timing out. Adding Cinder and dropping Nova
for now until there's clear evidence of Nova calling c-api or os-brick
incorrectly.
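For anyone picking this up on the Cinder side, a rough way to chase the
timeout (a sketch, assuming the RDO-style log paths shown later in this
report; the oslo.messaging message ID itself won't show up in c-vol, but
the request and attachment IDs should):

  # On the controller(s) still running cinder-volume:
  egrep 'req-a350dc82-f052-48b5-a834-dcf550fd5089|b761e823-a97f-4d91-9f99-b2aac7ac36f9' \
      /var/log/cinder/volume.log
  # The ~60s hang matches oslo.messaging's default rpc_response_timeout (60s);
  # check whether the deployment raised it from the default:
  grep rpc_response_timeout /etc/cinder/cinder.conf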
** Also affects: cinder
Importance: Undecided
Status: New
** No longer affects: cinder
** Project changed: nova => cinder
--
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/1833504
Title:
Nova evacuate failed with Cinder driver backend HP3PAR 8000 ISCSI
Multipath
Status in Cinder:
New
Bug description:
Description of problem:
We deployed a 3 controller + 2 compute node OpenStack environment
(Queens). The Cinder volume service is configured to use an HP 3PAR
SAN connected over iSCSI, with multipath enabled. After multipath was
configured, instance evacuation fails: during the evacuate operation
the VM goes into the rebuild state and then ends up in the ERROR
state. Evacuation worked fine before multipath was configured.
Version-Release number of selected component (if applicable):
Nova versions:
[root@xxx ~]# rpm -qa | grep nova
openstack-nova-placement-api-17.0.10-1.el7.noarch
python-nova-17.0.10-1.el7.noarch
openstack-nova-novncproxy-17.0.10-1.el7.noarch
openstack-nova-console-17.0.10-1.el7.noarch
openstack-nova-compute-17.0.10-1.el7.noarch
python2-novaclient-10.1.0-1.el7.noarch
openstack-nova-scheduler-17.0.10-1.el7.noarch
openstack-nova-api-17.0.10-1.el7.noarch
openstack-nova-common-17.0.10-1.el7.noarch
openstack-nova-conductor-17.0.10-1.el7.noarch
Cinder versions:
[root@xxxx ~]# rpm -qa | grep cinder
python-cinder-12.0.7-1.el7.noarch
python2-cinderclient-3.5.0-1.el7.noarch
openstack-cinder-12.0.7-1.el7.noarch
How reproducible:
Steps to Reproduce:
1. Enable multipath in the Cinder configuration file
2. Take one controller down
3. Evacuate the VM to another working controller (a sketch of steps 1
and 3 follows this list)
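For reference, a minimal sketch of steps 1 and 3 on an RPM-based
deployment like this one. The report doesn't say exactly which
multipath option was toggled, and the cinder.conf backend section
name (3par-iscsi) is an assumption; use whatever section
enabled_backends actually points at:

  # Compute side: have os-brick attach volumes via multipath
  openstack-config --set /etc/nova/nova.conf libvirt volume_use_multipath True
  # Cinder backend side: multipath for volume/image transfers (section name assumed)
  openstack-config --set /etc/cinder/cinder.conf 3par-iscsi use_multipath_for_image_xfer True
  systemctl restart openstack-nova-compute openstack-cinder-volume

  # Step 3: evacuate the instance to a surviving host
  # (instance ID and target host taken from the logs below)
  nova evacuate 1616677a-4600-4400-a561-5502c2198eaa mdu-ctrlha-02.local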
Actual results:
VM evacuation fails
Expected results:
VM evacuation should work
Additional info:
Below is the OpenStack Dashboard error:
Error: Failed to perform requested operation on instance "ctrl-01",
the instance has an error status: Please try again later [Error: The
server has either erred or is incapable of performing the requested
operation. (HTTP 500) (Request-ID: req-
a350dc82-f052-48b5-a834-dcf550fd5089)].
NOVA log:
2019-06-19 15:21:49.379 45813 INFO nova.compute.manager [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] [instance: 1616677a-4600-4400-a561-5502c2198eaa] Rebuilding instance
2019-06-19 15:21:49.494 45813 INFO nova.compute.claims [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] [instance: 1616677a-4600-4400-a561-5502c2198eaa] Attempting claim on node mdu-ctrlha-02.local: memory 2048 MB, disk 20 GB, vcpus 2 CPU
2019-06-19 15:21:49.495 45813 INFO nova.compute.claims [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] [instance: 1616677a-4600-4400-a561-5502c2198eaa] Total memory: 786338 MB, used: 2560.00 MB
2019-06-19 15:21:49.495 45813 INFO nova.compute.claims [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] [instance: 1616677a-4600-4400-a561-5502c2198eaa] memory limit not specified, defaulting to unlimited
2019-06-19 15:21:49.496 45813 INFO nova.compute.claims [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] [instance: 1616677a-4600-4400-a561-5502c2198eaa] Total disk: 236 GB, used: 20.00 GB
2019-06-19 15:21:49.496 45813 INFO nova.compute.claims [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] [instance: 1616677a-4600-4400-a561-5502c2198eaa] disk limit not specified, defaulting to unlimited
2019-06-19 15:21:49.497 45813 INFO nova.compute.claims [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] [instance: 1616677a-4600-4400-a561-5502c2198eaa] Total vcpu: 32 VCPU, used: 2.00 VCPU
2019-06-19 15:21:49.497 45813 INFO nova.compute.claims [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] [instance: 1616677a-4600-4400-a561-5502c2198eaa] vcpu limit not specified, defaulting to unlimited
2019-06-19 15:21:49.499 45813 INFO nova.compute.claims [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] [instance: 1616677a-4600-4400-a561-5502c2198eaa] Claim successful on node mdu-ctrlha-02.local
2019-06-19 15:21:49.649 45813 INFO nova.compute.resource_tracker [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] Updating from migration 1616677a-4600-4400-a561-5502c2198eaa
2019-06-19 15:21:49.738 45813 INFO nova.compute.manager [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] disk not on shared storage, rebuilding from: ''
2019-06-19 15:21:53.026 45813 WARNING nova.compute.manager [req-9a28e725-d05e-400e-a2ee-576e4fd510f6 a9eeb9d45c38402bb43b3059418ea0eb 139f542239b444098a9831d2dfa0f6ed - default default] [instance: 1616677a-4600-4400-a561-5502c2198eaa] Received unexpected event network-vif-unplugged-62a68b56-4154-4778-805c-4102252e546a for instance with vm_state stopped and task_state rebuilding.
2019-06-19 15:21:59.246 45813 INFO nova.network.neutronv2.api [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] [instance: 1616677a-4600-4400-a561-5502c2198eaa] Updating port 62a68b56-4154-4778-805c-4102252e546a with attributes {'binding:profile': {}, 'device_owner': u'compute:mdu-ctrlha-02.local', 'binding:host_id': 'mdu-ctrlha-02.local'}
2019-06-19 15:22:02.153 45813 INFO nova.compute.manager [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] [instance: 1616677a-4600-4400-a561-5502c2198eaa] Detaching volume ab8d57f2-0f73-4000-8139-bf8606f2d296
2019-06-19 15:22:22.329 45813 INFO nova.compute.resource_tracker [req-5457052e-87de-4a3a-bc71-6bf7e6c36564 - - - - -] Updating from migration 1616677a-4600-4400-a561-5502c2198eaa
2019-06-19 15:22:22.476 45813 WARNING nova.compute.resource_tracker [req-5457052e-87de-4a3a-bc71-6bf7e6c36564 - - - - -] Instance 1616677a-4600-4400-a561-5502c2198eaa has been moved to another host mdu-ctrlha-01.local(mdu-ctrlha-01.local). There are allocations remaining against the source host that might need to be removed: {u'resources': {u'VCPU': 2, u'MEMORY_MB': 2048, u'DISK_GB': 20}}.
2019-06-19 15:22:22.476 45813 INFO nova.compute.resource_tracker [req-5457052e-87de-4a3a-bc71-6bf7e6c36564 - - - - -] Final resource view: name=mdu-ctrlha-02.local phys_ram=786338MB used_ram=4608MB phys_disk=236GB used_disk=40GB total_vcpus=32 used_vcpus=4 pci_stats=[]
2019-06-19 15:23:02.374 45813 ERROR nova.volume.cinder [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] Delete attachment failed for attachment b761e823-a97f-4d91-9f99-b2aac7ac36f9. Error: The server has either erred or is incapable of performing the requested operation. (HTTP 500) (Request-ID: req-a350dc82-f052-48b5-a834-dcf550fd5089) Code: 500: ClientException: The server has either erred or is incapable of performing the requested operation. (HTTP 500) (Request-ID: req-a350dc82-f052-48b5-a834-dcf550fd5089)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] [instance: 1616677a-4600-4400-a561-5502c2198eaa] Setting instance vm_state to ERROR: ClientException: The server has either erred or is incapable of performing the requested operation. (HTTP 500) (Request-ID: req-a350dc82-f052-48b5-a834-dcf550fd5089)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] Traceback (most recent call last):
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 7599, in _error_out_instance_on_exception
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] yield
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 2925, in rebuild_instance
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] migration, request_spec)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 2987, in _do_rebuild_instance_with_claim
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] self._do_rebuild_instance(*args, **kwargs)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 3144, in _do_rebuild_instance
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] self._rebuild_default_impl(**kwargs)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 2810, in _rebuild_default_impl
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] detach_block_devices(context, bdms)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 3117, in detach_block_devices
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] destroy_bdm=False)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 5425, in _detach_volume
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] attachment_id=attachment_id, destroy_bdm=destroy_bdm)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/nova/virt/block_device.py", line 416, in detach
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] attachment_id, destroy_bdm)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py", line 274, in inner
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] return f(*args, **kwargs)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/nova/virt/block_device.py", line 413, in _do_locked_detach
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] self._do_detach(*args, **_kwargs)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/nova/virt/block_device.py", line 401, in _do_detach
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] volume_api.attachment_delete(context, self['attachment_id'])
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/nova/volume/cinder.py", line 379, in wrapper
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] res = method(self, ctx, *args, **kwargs)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/nova/volume/cinder.py", line 415, in wrapper
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] res = method(self, ctx, attachment_id, *args, **kwargs)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/nova/volume/cinder.py", line 841, in attachment_delete
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] 'code': getattr(ex, 'code', None)})
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] self.force_reraise()
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] six.reraise(self.type_, self.value, self.tb)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/nova/volume/cinder.py", line 834, in attachment_delete
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] attachment_id)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/cinderclient/v3/attachments.py", line 39, in delete
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] return self._delete("/attachments/%s" % base.getid(attachment))
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/cinderclient/base.py", line 339, in _delete
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] resp, body = self.api.client.delete(url)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/cinderclient/client.py", line 209, in delete
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] return self._cs_request(url, 'DELETE', **kwargs)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/cinderclient/client.py", line 191, in _cs_request
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] return self.request(url, method, **kwargs)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] File "/usr/lib/python2.7/site-packages/cinderclient/client.py", line 177, in request
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] raise exceptions.from_response(resp, body)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa] ClientException: The server has either erred or is incapable of performing the requested operation. (HTTP 500) (Request-ID: req-a350dc82-f052-48b5-a834-dcf550fd5089)
2019-06-19 15:23:02.751 45813 ERROR nova.compute.manager [instance: 1616677a-4600-4400-a561-5502c2198eaa]
2019-06-19 15:23:03.068 45813 INFO nova.compute.manager [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] [instance: 1616677a-4600-4400-a561-5502c2198eaa] Successfully reverted task state from rebuilding on failure for instance.
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server [req-e1685895-04dc-44f7-9735-a5de7b1fb61c be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] Exception during message handling: ClientException: The server has either erred or is incapable of performing the requested operation. (HTTP 500) (Request-ID: req-a350dc82-f052-48b5-a834-dcf550fd5089)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 220, in dispatch
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 190, in _do_dispatch
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 226, in inner
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server return func(*args, **kwargs)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/exception_wrapper.py", line 76, in wrapped
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server function_name, call_dict, binary)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server self.force_reraise()
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/exception_wrapper.py", line 67, in wrapped
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 186, in decorated_function
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server "Error: %s", e, instance=instance)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server self.force_reraise()
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 156, in decorated_function
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/compute/utils.py", line 977, in decorated_function
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 214, in decorated_function
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server kwargs['instance'], e, sys.exc_info())
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server self.force_reraise()
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 202, in decorated_function
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 2925, in rebuild_instance
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server migration, request_spec)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 2987, in _do_rebuild_instance_with_claim
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server self._do_rebuild_instance(*args, **kwargs)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 3144, in _do_rebuild_instance
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server self._rebuild_default_impl(**kwargs)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 2810, in _rebuild_default_impl
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server detach_block_devices(context, bdms)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 3117, in detach_block_devices
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server destroy_bdm=False)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 5425, in _detach_volume
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server attachment_id=attachment_id, destroy_bdm=destroy_bdm)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/virt/block_device.py", line 416, in detach
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server attachment_id, destroy_bdm)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py", line 274, in inner
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server return f(*args, **kwargs)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/virt/block_device.py", line 413, in _do_locked_detach
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server self._do_detach(*args, **_kwargs)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/virt/block_device.py", line 401, in _do_detach
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server volume_api.attachment_delete(context, self['attachment_id'])
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/volume/cinder.py", line 379, in wrapper
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server res = method(self, ctx, *args, **kwargs)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/volume/cinder.py", line 415, in wrapper
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server res = method(self, ctx, attachment_id, *args, **kwargs)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/volume/cinder.py", line 841, in attachment_delete
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server 'code': getattr(ex, 'code', None)})
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server self.force_reraise()
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/volume/cinder.py", line 834, in attachment_delete
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server attachment_id)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinderclient/v3/attachments.py", line 39, in delete
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server return self._delete("/attachments/%s" % base.getid(attachment))
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinderclient/base.py", line 339, in _delete
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server resp, body = self.api.client.delete(url)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinderclient/client.py", line 209, in delete
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server return self._cs_request(url, 'DELETE', **kwargs)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinderclient/client.py", line 191, in _cs_request
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server return self.request(url, method, **kwargs)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/cinderclient/client.py", line 177, in request
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server raise exceptions.from_response(resp, body)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server ClientException: The server has either erred or is incapable of performing the requested operation. (HTTP 500) (Request-ID: req-a350dc82-f052-48b5-a834-dcf550fd5089)
2019-06-19 15:23:03.085 45813 ERROR oslo_messaging.rpc.server
Cinder API log:
===============
egrep 'req-a350dc82-f052-48b5-a834-dcf550fd5089' /var/log/cinder/*
/var/log/cinder/api.log:2019-06-19 15:22:02.228 10647 INFO
cinder.api.openstack.wsgi [req-a350dc82-f052-48b5-a834-dcf550fd5089
be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 -
default default] DELETE
http://mducontrollerha:8776/v3/04d0321b6b9f4f17bf093f0c1c919a30/attachments/b761e823
-a97f-4d91-9f99-b2aac7ac36f9
/var/log/cinder/api.log:2019-06-19 15:23:02.316 10647 ERROR
cinder.api.middleware.fault [req-a350dc82-f052-48b5-a834-dcf550fd5089
be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 -
default default] Caught error: <class
'oslo_messaging.exceptions.MessagingTimeout'> Timed out waiting for a
reply to message ID 0f7e5118842547c7ae903bb7aea8d4a9:
MessagingTimeout: Timed out waiting for a reply to message ID
0f7e5118842547c7ae903bb7aea8d4a9
/var/log/cinder/api.log:2019-06-19 15:23:02.363 10647 INFO
cinder.api.middleware.fault [req-a350dc82-f052-48b5-a834-dcf550fd5089
be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 -
default default]
http://mducontrollerha:8776/v3/04d0321b6b9f4f17bf093f0c1c919a30/attachments/b761e823
-a97f-4d91-9f99-b2aac7ac36f9 returned with HTTP 500
/var/log/cinder/api.log:2019-06-19 15:23:02.365 10647 INFO eventlet.wsgi.server [req-a350dc82-f052-48b5-a834-dcf550fd5089 be29df3f4eae49c789232c0921d3fe90 04d0321b6b9f4f17bf093f0c1c919a30 - default default] 10.200.6.4 "DELETE /v3/04d0321b6b9f4f17bf093f0c1c919a30/attachments/b761e823-a97f-4d91-9f99-b2aac7ac36f9 HTTP/1.1" status: 500 len: 475 time: 60.1395781
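Note the 60.14 second request time above lines up with oslo.messaging's
default 60s rpc_response_timeout, i.e. c-api gave up waiting for the
c-vol reply. Initiator-side state worth capturing on the affected node
while reproducing (a sketch, not taken from the original report):

  multipath -ll               # multipath maps and per-path states
  iscsiadm -m session -P 3    # active iSCSI sessions and attached devices
  journalctl -u multipathd --since '2019-06-19 15:22' --until '2019-06-19 15:24'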
To manage notifications about this bug go to:
https://bugs.launchpad.net/cinder/+bug/1833504/+subscriptions