[Bug 1817933] [NEW] TestServerAdvancedOps.test_server_sequence_suspend_resume intermittently fails with "nova.exception.UnexpectedTaskStateError: Conflict updating instance 8a2a11db-4322-4b93-9d54-e7fb3c353370. Expected: {'task_state': ['suspending']}. Actual: {'task_state': None}"
Public bug reported:
Seen here:
http://logs.openstack.org/93/633293/13/check/tempest-slow-py3/b9ed6f3/job-output.txt.gz#_2019-02-27_00_51_05_003004
2019-02-27 00:51:05.003004 | controller | {0} tempest.scenario.test_server_advanced_ops.TestServerAdvancedOps.test_server_sequence_suspend_resume [276.272117s] ... FAILED
2019-02-27 00:51:05.003093 | controller |
2019-02-27 00:51:05.003161 | controller | Captured traceback:
2019-02-27 00:51:05.003218 | controller | ~~~~~~~~~~~~~~~~~~~
2019-02-27 00:51:05.003319 | controller | b'Traceback (most recent call last):'
2019-02-27 00:51:05.003498 | controller | b' File "/opt/stack/tempest/tempest/common/utils/__init__.py", line 89, in wrapper'
2019-02-27 00:51:05.003605 | controller | b' return f(*func_args, **func_kwargs)'
2019-02-27 00:51:05.003853 | controller | b' File "/opt/stack/tempest/tempest/scenario/test_server_advanced_ops.py", line 56, in test_server_sequence_suspend_resume'
2019-02-27 00:51:05.003919 | controller | b" 'SUSPENDED')"
2019-02-27 00:51:05.004097 | controller | b' File "/opt/stack/tempest/tempest/common/waiters.py", line 96, in wait_for_server_status'
2019-02-27 00:51:05.004202 | controller | b' raise lib_exc.TimeoutException(message)'
2019-02-27 00:51:05.004330 | controller | b'tempest.lib.exceptions.TimeoutException: Request timed out'
2019-02-27 00:51:05.004768 | controller | b'Details: (TestServerAdvancedOps:test_server_sequence_suspend_resume) Server 8a2a11db-4322-4b93-9d54-e7fb3c353370 failed to reach SUSPENDED status and task state "None" within the required time (196 s). Current status: SHUTOFF. Current task state: None.'
2019-02-27 00:51:05.004806 | controller | b''
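(For reference, the tempest waiter that times out above simply polls the server's status until a deadline expires. The following is only a rough, simplified sketch of that polling pattern, assuming a hypothetical get_server callable; it is not tempest's actual implementation.)

import time

class TimeoutException(Exception):
    pass

def wait_for_server_status(get_server, server_id, target_status,
                           timeout=196, interval=1):
    # get_server is a hypothetical callable returning a dict with a
    # 'status' key; this only illustrates the poll-until-deadline idea.
    deadline = time.time() + timeout
    server = get_server(server_id)
    while server['status'] != target_status:
        if time.time() > deadline:
            raise TimeoutException(
                'Server %s failed to reach %s status within %s seconds. '
                'Current status: %s.' % (server_id, target_status,
                                         timeout, server['status']))
        time.sleep(interval)
        server = get_server(server_id)
    return server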
Looks like there was a race while suspending the instance: the task_state
was reset to None between the time the API set it to "suspending" and the
time the compute service tried to update the instance in the database:
http://logs.openstack.org/93/633293/13/check/tempest-slow-py3/b9ed6f3/compute1/logs/screen-n-cpu.txt.gz?level=TRACE#_Feb_27_00_47_48_526915
Feb 27 00:47:47.706484 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: WARNING nova.compute.manager [None req-7bc42882-04b4-491d-89cf-5a55ed27310e None None] [instance: 8a2a11db-4322-4b93-9d54-e7fb3c353370] Instance is paused unexpectedly. Ignore.
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server [None req-e189d281-4423-46f9-b1c3-a2216124b595 tempest-TestServerAdvancedOps-522090128 tempest-TestServerAdvancedOps-522090128] Exception during message handling: UnexpectedTaskStateError_Remote: Conflict updating instance 8a2a11db-4322-4b93-9d54-e7fb3c353370. Expected: {'task_state': ['suspending']}. Actual: {'task_state': None}
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: Traceback (most recent call last):
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/opt/stack/nova/nova/db/sqlalchemy/api.py", line 2813, in _instance_update
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: update_on_match(compare, 'uuid', values)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/usr/local/lib/python3.6/dist-packages/oslo_db/sqlalchemy/orm.py", line 53, in update_on_match
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: self, specimen, surrogate_key, values, **kw)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/usr/local/lib/python3.6/dist-packages/oslo_db/sqlalchemy/update_match.py", line 194, in update_on_match
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: raise NoRowsMatched("Zero rows matched for %d attempts" % attempts)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: oslo_db.sqlalchemy.update_match.NoRowsMatched: Zero rows matched for 3 attempts
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: During handling of the above exception, another exception occurred:
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: Traceback (most recent call last):
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/opt/stack/nova/nova/conductor/manager.py", line 129, in _object_dispatch
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: return getattr(target, method)(*args, **kwargs)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/usr/local/lib/python3.6/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: return fn(self, *args, **kwargs)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/opt/stack/nova/nova/objects/instance.py", line 847, in save
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: columns_to_join=_expected_cols(expected_attrs))
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/opt/stack/nova/nova/db/api.py", line 889, in instance_update_and_get_original
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: expected=expected)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/opt/stack/nova/nova/db/sqlalchemy/api.py", line 170, in wrapper
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: return f(*args, **kwargs)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/usr/local/lib/python3.6/dist-packages/oslo_db/api.py", line 154, in wrapper
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ectxt.value = e.inner_exc
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/usr/local/lib/python3.6/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: self.force_reraise()
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/usr/local/lib/python3.6/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: six.reraise(self.type_, self.value, self.tb)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/usr/local/lib/python3.6/dist-packages/six.py", line 693, in reraise
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: raise value
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/usr/local/lib/python3.6/dist-packages/oslo_db/api.py", line 142, in wrapper
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: return f(*args, **kwargs)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/opt/stack/nova/nova/db/sqlalchemy/api.py", line 227, in wrapped
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: return f(context, *args, **kwargs)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/opt/stack/nova/nova/db/sqlalchemy/api.py", line 2736, in instance_update_and_get_original
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: context, instance_uuid, values, expected, original=instance_ref))
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: File "/opt/stack/nova/nova/db/sqlalchemy/api.py", line 2872, in _instance_update
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: raise exc(**exc_props)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: nova.exception.UnexpectedTaskStateError: Conflict updating instance 8a2a11db-4322-4b93-9d54-e7fb3c353370. Expected: {'task_state': ['suspending']}. Actual: {'task_state': None}
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 166, in _process_incoming
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 265, in dispatch
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 194, in _do_dispatch
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 79, in wrapped
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server function_name, call_dict, binary, tb)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server self.force_reraise()
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 69, in wrapped
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 162, in decorated_function
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server e.format_message())
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server self.force_reraise()
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 155, in decorated_function
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1301, in decorated_function
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server kwargs['instance'], e, sys.exc_info())
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server self.force_reraise()
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 201, in decorated_function
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 4951, in suspend_instance
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server instance.save(expected_task_state=task_states.SUSPENDING)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_versionedobjects/base.py", line 210, in wrapper
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server ctxt, self, fn.__name__, args, kwargs)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/conductor/rpcapi.py", line 246, in object_action
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server objmethod=objmethod, args=args, kwargs=kwargs)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 178, in call
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server retry=self.retry)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 128, in _send
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server retry=retry)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 645, in send
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server call_monitor_timeout, retry=retry)
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 636, in _send
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server raise result
Feb 27 00:47:48.526915 ubuntu-bionic-inap-mtl01-0003055555 nova-compute[17274]: ERROR oslo_messaging.rpc.server UnexpectedTaskStateError_Remote: Conflict updating instance 8a2a11db-4322-4b93-9d54-e7fb3c353370. Expected: {'task_state': ['suspending']}. Actual: {'task_state': None}
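For context, the failure above comes out of nova's conditional instance
update: the compute manager calls
instance.save(expected_task_state=task_states.SUSPENDING), and the DB layer
only applies the update if the row's task_state still matches the expected
value (via oslo.db's update_on_match); when zero rows match because something
else already reset task_state to None, UnexpectedTaskStateError is raised.
A minimal, self-contained sketch of that compare-then-update pattern, using a
stand-in dict instead of nova's actual DB code:

class UnexpectedTaskStateError(Exception):
    pass

# Stand-in for the instance row; in nova this lives in the instances table.
instance_row = {'uuid': '8a2a11db-4322-4b93-9d54-e7fb3c353370',
                'task_state': 'suspending'}

def instance_update(values, expected_task_state):
    # Apply 'values' only if the current task_state matches the expectation,
    # mirroring the update_on_match semantics: if another request has already
    # changed task_state (e.g. reset it to None), nothing is updated and the
    # caller gets UnexpectedTaskStateError instead of silently clobbering.
    if instance_row['task_state'] != expected_task_state:
        raise UnexpectedTaskStateError(
            "Conflict updating instance %s. Expected: {'task_state': [%r]}. "
            "Actual: {'task_state': %r}" % (instance_row['uuid'],
                                            expected_task_state,
                                            instance_row['task_state']))
    instance_row.update(values)

# The race seen in this bug: some other request clears task_state first...
instance_row['task_state'] = None
# ...so the compute manager's save for the suspend transition fails.
try:
    instance_update({'vm_state': 'suspended', 'task_state': None},
                    expected_task_state='suspending')
except UnexpectedTaskStateError as exc:
    print(exc)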
I haven't dug into the logs yet to find out what other requests are
working on that instance at the same time.
This also looks to be pretty rare in the gate:
http://logstash.openstack.org/#dashboard/file/logstash.json?query=message%3A%5C%22nova.exception.UnexpectedTaskStateError%3A%20Conflict%20updating%20instance%5C%22%20AND%20message%3A%5C%22Expected%3A%20%7B'task_state'%3A%20%5B'suspending'%5D%7D.%20Actual%3A%20%7B'task_state'%3A%20None%7D%5C%22%20AND%20tags%3A%5C%22screen-n-cpu.txt%5C%22&from=7d
** Affects: nova
Importance: Undecided
Status: New
** Tags: gate-failure race-condition
--
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/1817933
To manage notifications about this bug go to:
https://bugs.launchpad.net/nova/+bug/1817933/+subscriptions