yahoo-eng-team team mailing list archive

[Bug 1394358] [NEW] check-tempest-dsvm-neutron-aiopcpu causes neutron failure

 

Public bug reported:

The nova error is as follows:

2014-11-19 05:31:24.097 ERROR nova.compute.manager [req-6c02bdd7-448a-4c68-9296-70c41f28b53e ServerActionsTestXML-263356571 ServerActionsTestXML-1715780796] [instance: e6cba227-74ea-4592-8978-2646ca26e35f] Setting instance vm_state to ERROR
2014-11-19 05:31:24.097 13050 TRACE nova.compute.manager [instance: e6cba227-74ea-4592-8978-2646ca26e35f] Traceback (most recent call last):
2014-11-19 05:31:24.097 13050 TRACE nova.compute.manager [instance: e6cba227-74ea-4592-8978-2646ca26e35f]   File "/opt/stack/new/nova/nova/compute/manager.py", line 6109, in _error_out_instance_on_exception
2014-11-19 05:31:24.097 13050 TRACE nova.compute.manager [instance: e6cba227-74ea-4592-8978-2646ca26e35f]     yield
2014-11-19 05:31:24.097 13050 TRACE nova.compute.manager [instance: e6cba227-74ea-4592-8978-2646ca26e35f]   File "/opt/stack/new/nova/nova/compute/manager.py", line 3569, in finish_revert_resize
2014-11-19 05:31:24.097 13050 TRACE nova.compute.manager [instance: e6cba227-74ea-4592-8978-2646ca26e35f]     block_device_info, power_on)
2014-11-19 05:31:24.097 13050 TRACE nova.compute.manager [instance: e6cba227-74ea-4592-8978-2646ca26e35f]   File "/opt/stack/new/nova/nova/virt/libvirt/driver.py", line 6176, in finish_revert_migration
2014-11-19 05:31:24.097 13050 TRACE nova.compute.manager [instance: e6cba227-74ea-4592-8978-2646ca26e35f]     block_device_info, power_on)
2014-11-19 05:31:24.097 13050 TRACE nova.compute.manager [instance: e6cba227-74ea-4592-8978-2646ca26e35f]   File "/opt/stack/new/nova/nova/virt/libvirt/driver.py", line 4500, in _create_domain_and_network
2014-11-19 05:31:24.097 13050 TRACE nova.compute.manager [instance: e6cba227-74ea-4592-8978-2646ca26e35f]     raise exception.VirtualInterfaceCreateException()
2014-11-19 05:31:24.097 13050 TRACE nova.compute.manager [instance: e6cba227-74ea-4592-8978-2646ca26e35f] VirtualInterfaceCreateException: Virtual Interface creation failed
2014-11-19 05:31:24.097 13050 TRACE nova.compute.manager [instance: e6cba227-74ea-4592-8978-2646ca26e35f] 
2014-11-19 05:31:24.400 ERROR oslo.messaging.rpc.dispatcher [req-6c02bdd7-448a-4c68-9296-70c41f28b53e ServerActionsTestXML-263356571 ServerActionsTestXML-1715780796] Exception during message handling: Virtual Interface creation failed
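
For context (this is my reading of the trace, not something the logs state explicitly): _create_domain_and_network raises VirtualInterfaceCreateException when Nova gives up waiting for the network-vif-plugged notification from Neutron, i.e. the guest's VIF never gets wired up in time. A rough sketch of that wait-with-timeout pattern, with every name and the timeout value assumed purely for illustration, not Nova's actual code:

import queue


class VirtualInterfaceCreateException(Exception):
    """Raised when the expected network-vif-plugged events never arrive."""


def wait_for_vifs_plugged(events, expected_vifs, timeout=300):
    # 'events' stands in for the queue that Nova's external-event handling
    # would feed; 'expected_vifs' are the VIF ids the agent should plug.
    remaining = set(expected_vifs)
    try:
        while remaining:
            vif_id = events.get(timeout=timeout)
            remaining.discard(vif_id)
    except queue.Empty:
        # Treat the timeout as fatal, as with vif_plugging_is_fatal=True.
        raise VirtualInterfaceCreateException(
            "Virtual Interface creation failed, still waiting on: %s"
            % sorted(remaining))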

The neutron agent error is as follows:

2014-11-19 05:13:50.339 5051 ERROR neutron.plugins.openvswitch.agent.ovs_neutron_agent [req-3667b9d8-1b2a-4a4a-b6f1-f2b1810801a1 None] Error while processing VIF ports
2014-11-19 05:13:50.339 5051 TRACE neutron.plugins.openvswitch.agent.ovs_neutron_agent Traceback (most recent call last):
2014-11-19 05:13:50.339 5051 TRACE neutron.plugins.openvswitch.agent.ovs_neutron_agent   File "/opt/stack/new/neutron/neutron/plugins/openvswitch/agent/ovs_neutron_agent.py", line 1421, in rpc_loop
2014-11-19 05:13:50.339 5051 TRACE neutron.plugins.openvswitch.agent.ovs_neutron_agent     ovs_restarted)
2014-11-19 05:13:50.339 5051 TRACE neutron.plugins.openvswitch.agent.ovs_neutron_agent   File "/opt/stack/new/neutron/neutron/plugins/openvswitch/agent/ovs_neutron_agent.py", line 1229, in process_network_ports
2014-11-19 05:13:50.339 5051 TRACE neutron.plugins.openvswitch.agent.ovs_neutron_agent     devices_added_updated, ovs_restarted)
2014-11-19 05:13:50.339 5051 TRACE neutron.plugins.openvswitch.agent.ovs_neutron_agent   File "/opt/stack/new/neutron/neutron/plugins/openvswitch/agent/ovs_neutron_agent.py", line 1125, in treat_devices_added_or_updated
2014-11-19 05:13:50.339 5051 TRACE neutron.plugins.openvswitch.agent.ovs_neutron_agent     ovs_restarted)
2014-11-19 05:13:50.339 5051 TRACE neutron.plugins.openvswitch.agent.ovs_neutron_agent   File "/opt/stack/new/neutron/neutron/plugins/openvswitch/agent/ovs_neutron_agent.py", line 1024, in treat_vif_port
2014-11-19 05:13:50.339 5051 TRACE neutron.plugins.openvswitch.agent.ovs_neutron_agent     fixed_ips, device_owner, ovs_restarted)
2014-11-19 05:13:50.339 5051 TRACE neutron.plugins.openvswitch.agent.ovs_neutron_agent   File "/opt/stack/new/neutron/neutron/plugins/openvswitch/agent/ovs_neutron_agent.py", line 637, in port_bound
2014-11-19 05:13:50.339 5051 TRACE neutron.plugins.openvswitch.agent.ovs_neutron_agent     lvm = self.local_vlan_map[net_uuid]
2014-11-19 05:13:50.339 5051 TRACE neutron.plugins.openvswitch.agent.ovs_neutron_agent KeyError: u'b13d216f-4713-4b1d-8353-26423dbf5bdc'

It appears that network b13d216f-4713-4b1d-8353-26423dbf5bdc is deleted
on the server while the agent is still processing ports for that
network, so the lookup in self.local_vlan_map fails with the KeyError
above and the whole VIF-processing pass errors out.
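
For illustration, the agent could tolerate this race by treating a missing local_vlan_map entry as "network already deleted" instead of letting the KeyError abort the loop. A minimal standalone sketch of that defensive lookup (not the actual Neutron code or a proposed patch; names are assumed):

import logging

LOG = logging.getLogger(__name__)


def bind_port(local_vlan_map, net_uuid, port_id):
    # Defensive lookup: the network may have been deleted on the server
    # after the list of ports to process was fetched.
    lvm = local_vlan_map.get(net_uuid)
    if lvm is None:
        LOG.warning("Network %s has no local VLAN mapping; skipping port %s",
                    net_uuid, port_id)
        return False
    # ... wiring the port onto lvm would continue here ...
    return True

The same check would be needed anywhere the agent indexes local_vlan_map after talking to the server, since the delete can land at any point in the rpc_loop iteration.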

See the logs here:

http://logs.openstack.org/39/134639/16/experimental/check-tempest-dsvm-neutron-aiopcpu/7aac738/

** Affects: neutron
     Importance: High
         Status: New

** Changed in: neutron
   Importance: Undecided => High

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to neutron.
https://bugs.launchpad.net/bugs/1394358

To manage notifications about this bug go to:
https://bugs.launchpad.net/neutron/+bug/1394358/+subscriptions

