
[Bug 1794558] Re: Tempest test AttachInterfacesUnderV243Test results in FixedIpNotFoundForSpecificInstance traceback in n-cpu logs

 

This is the port delete request:

http://logs.openstack.org/98/604898/2/check/nova-next/df58e8a/logs/screen-q-svc.txt.gz#_Sep_26_00_20_10_283226

Sep 26 00:20:10.283226 ubuntu-xenial-ovh-bhs1-0002284194 neutron-server[24409]: DEBUG neutron.plugins.ml2.plugin [None req-8e9ab2d9-25b2-452e-8fe1-f66a7b065773 tempest-AttachInterfacesUnderV243Test-1609135526 tempest-AttachInterfacesUnderV243Test-1609135526] Deleting port 2ac9b1d5-ba6f-4cbb-a867-bdb61e176421 {{(pid=24781) _pre_delete_port /opt/stack/new/neutron/neutron/plugins/ml2/plugin.py:1535}}

I don't see req-8e9ab2d9-25b2-452e-8fe1-f66a7b065773 in either the tempest
logs or the n-cpu logs, so I'm not sure what is requesting the port
deletion.

Looks like it's probably nova because the instance is being deleted and
the vif is being unplugged:

http://logs.openstack.org/98/604898/2/check/nova-next/df58e8a/logs/screen-n-cpu.txt.gz#_Sep_26_00_20_09_945815

Sep 26 00:20:09.945815 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: INFO os_vif [None req-5f7ba3e6-6d08-4457-970d-7b3ee9b4015a tempest-AttachInterfacesUnderV243Test-1609135526 tempest-AttachInterfacesUnderV243Test-1609135526] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:cf:c0,bridge_name='br-int',has_traffic_filtering=True,id=2ac9b1d5-ba6f-4cbb-a867-bdb61e176421,network=Network(fedfa4f2-f0a8-4649-833f-48dbf3aa0f15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ac9b1d5-ba')

Oh, I get the failure now: tempest sends the removeFixedIp request, which
is asynchronous, but doesn't wait for it to complete. It then enters its
tearDown routine, which deletes the server (and port) before the
removeFixedIp request is processed by nova-compute. At that point
nova-compute complains that the fixed IP is no longer associated with the
instance. So this is a testing bug.
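
The fix on the tempest side would be to wait for the fixed IP to actually
disappear from the server before the test returns and tearDown deletes the
server. A minimal sketch of such a waiter is below, assuming a tempest-style
servers_client; the helper name, the polling defaults, and the
remove_fixed_ip call shown in the comment are illustrative, not the exact
tempest change:

import time

def wait_for_fixed_ip_removed(servers_client, server_id, address,
                              timeout=120, interval=2):
    """Poll the server until 'address' no longer appears in its addresses.

    Assumes a tempest-style servers_client exposing show_server(); the
    helper and its defaults are illustrative, not the actual tempest fix.
    """
    start = time.time()
    while time.time() - start < timeout:
        server = servers_client.show_server(server_id)['server']
        ips = [addr['addr']
               for addrs in server.get('addresses', {}).values()
               for addr in addrs]
        if address not in ips:
            return
        time.sleep(interval)
    raise RuntimeError('Fixed IP %s still attached to server %s after %ss'
                       % (address, server_id, timeout))

# In the test this would run right after the (asynchronous) removeFixedIp
# call and before tearDown can delete the server, e.g.:
#   self.servers_client.remove_fixed_ip(server_id, address=ip)
#   wait_for_fixed_ip_removed(self.servers_client, server_id, ip)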

** Also affects: tempest
   Importance: Undecided
       Status: New

** Changed in: nova
       Status: Confirmed => Invalid

** Changed in: tempest
       Status: New => Triaged

** Changed in: tempest
   Importance: Undecided => Medium

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/1794558

Title:
  Tempest test AttachInterfacesUnderV243Test results in
  FixedIpNotFoundForSpecificInstance traceback in n-cpu logs

Status in OpenStack Compute (nova):
  Invalid
Status in tempest:
  Triaged

Bug description:
  This new Tempest change was recently merged:

  https://review.openstack.org/#/c/587734/

  And results in a traceback in the n-cpu logs:

  http://logs.openstack.org/98/604898/2/check/nova-next/df58e8a/logs/screen-n-cpu.txt.gz?level=TRACE#_Sep_26_00_20_14_150429

  Sep 26 00:20:14.150429 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server [None req-7aa4027c-2550-461f-a293-9b89d479f760 tempest-AttachInterfacesUnderV243Test-1609135526 tempest-AttachInterfacesUnderV243Test-1609135526] Exception during message handling: FixedIpNotFoundForSpecificInstance: Instance 609c0565-d193-445c-be8f-4667eecbf2f4 doesn't have fixed IP '10.1.0.4'.
  Sep 26 00:20:14.150678 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
  Sep 26 00:20:14.150905 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming
  Sep 26 00:20:14.151124 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
  Sep 26 00:20:14.151352 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 265, in dispatch
  Sep 26 00:20:14.151572 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
  Sep 26 00:20:14.151792 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 194, in _do_dispatch
  Sep 26 00:20:14.152036 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
  Sep 26 00:20:14.152270 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server   File "/opt/stack/new/nova/nova/exception_wrapper.py", line 79, in wrapped
  Sep 26 00:20:14.152483 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server     function_name, call_dict, binary, tb)
  Sep 26 00:20:14.152696 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
  Sep 26 00:20:14.152943 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server     self.force_reraise()
  Sep 26 00:20:14.153182 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
  Sep 26 00:20:14.153408 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, self.tb)
  Sep 26 00:20:14.153612 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server   File "/opt/stack/new/nova/nova/exception_wrapper.py", line 69, in wrapped
  Sep 26 00:20:14.153843 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
  Sep 26 00:20:14.154056 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server   File "/opt/stack/new/nova/nova/compute/manager.py", line 216, in decorated_function
  Sep 26 00:20:14.154284 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server     kwargs['instance'], e, sys.exc_info())
  Sep 26 00:20:14.154518 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
  Sep 26 00:20:14.154737 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server     self.force_reraise()
  Sep 26 00:20:14.154975 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
  Sep 26 00:20:14.155192 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, self.tb)
  Sep 26 00:20:14.155405 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server   File "/opt/stack/new/nova/nova/compute/manager.py", line 204, in decorated_function
  Sep 26 00:20:14.155635 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
  Sep 26 00:20:14.155860 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server   File "/opt/stack/new/nova/nova/compute/manager.py", line 4684, in remove_fixed_ip_from_instance
  Sep 26 00:20:14.156128 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server     address)
  Sep 26 00:20:14.156334 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server   File "/opt/stack/new/nova/nova/network/base_api.py", line 83, in wrapper
  Sep 26 00:20:14.156539 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server     res = f(self, context, *args, **kwargs)
  Sep 26 00:20:14.156766 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server   File "/opt/stack/new/nova/nova/network/neutronv2/api.py", line 1775, in remove_fixed_ip_from_instance
  Sep 26 00:20:14.156998 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server     instance_uuid=instance.uuid, ip=address)
  Sep 26 00:20:14.157199 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server FixedIpNotFoundForSpecificInstance: Instance 609c0565-d193-445c-be8f-4667eecbf2f4 doesn't have fixed IP '10.1.0.4'.
  Sep 26 00:20:14.157394 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: ERROR oslo_messaging.rpc.server 

  It looks like right before this happens, the network is deallocated
  for the instance:

  http://logs.openstack.org/98/604898/2/check/nova-next/df58e8a/logs/screen-n-cpu.txt.gz#_Sep_26_00_20_14_065420

  Sep 26 00:20:14.065420 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: INFO nova.compute.manager [-] [instance: 609c0565-d193-445c-be8f-4667eecbf2f4] Took 3.97 seconds to deallocate network for instance.

  So that probably explains why the instance doesn't have that fixed IP:
  the network for the instance has already been deallocated.
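
  The traceback bottoms out in remove_fixed_ip_from_instance in
  nova/network/neutronv2/api.py: roughly, nova asks neutron for the ports
  bound to the instance, looks for one carrying the requested address, and
  raises FixedIpNotFoundForSpecificInstance if none does. Since the port
  was already deleted, nothing matches. A simplified sketch of that logic
  (not the actual nova source; the exception class is a local stand-in):

# A simplified approximation of the code path at the bottom of the traceback
# above (nova/network/neutronv2/api.py, remove_fixed_ip_from_instance). This
# is an illustrative sketch, not the actual nova code; the exception class is
# a local stand-in for nova.exception.FixedIpNotFoundForSpecificInstance.

class FixedIpNotFoundForSpecificInstance(Exception):
    def __init__(self, instance_uuid, ip):
        super(FixedIpNotFoundForSpecificInstance, self).__init__(
            "Instance %s doesn't have fixed IP '%s'." % (instance_uuid, ip))


def remove_fixed_ip_from_instance(neutron, instance, address):
    # Ask neutron for the ports bound to this instance; a port that has
    # already been deleted is no longer returned here.
    ports = neutron.list_ports(device_id=instance.uuid)['ports']
    for port in ports:
        fixed_ips = port.get('fixed_ips', [])
        if any(ip['ip_address'] == address for ip in fixed_ips):
            # Drop the address from the port and push the update to neutron.
            remaining = [ip for ip in fixed_ips
                         if ip['ip_address'] != address]
            neutron.update_port(port['id'],
                                {'port': {'fixed_ips': remaining}})
            return
    # No bound port carries the address (here, because the port was already
    # deleted), so the request fails with the exception seen in the traceback.
    raise FixedIpNotFoundForSpecificInstance(instance.uuid, address)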

  The port is deallocated because it was deleted from neutron:

  Sep 26 00:20:13.932022 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: DEBUG nova.compute.manager [None req-b72d5903-21ae-41a8-918a-e90fbc34c375 service nova] [instance: 609c0565-d193-445c-be8f-4667eecbf2f4] Received event network-vif-deleted-2ac9b1d5-ba6f-4cbb-a867-bdb61e176421 {{(pid=521) external_instance_event /opt/stack/new/nova/nova/compute/manager.py:8111}}
  Sep 26 00:20:13.932430 ubuntu-xenial-ovh-bhs1-0002284194 nova-compute[521]: INFO nova.compute.manager [None req-b72d5903-21ae-41a8-918a-e90fbc34c375 service nova] [instance: 609c0565-d193-445c-be8f-4667eecbf2f4] Neutron deleted interface 2ac9b1d5-ba6f-4cbb-a867-bdb61e176421; detaching it from the instance and deleting it from the info cache

To manage notifications about this bug go to:
https://bugs.launchpad.net/nova/+bug/1794558/+subscriptions

