
yahoo-eng-team team mailing list archive

[Bug 1774257] [NEW] neutron-openvswitch-agent RuntimeError: Switch connection timeout


Public bug reported:

In neutron-openvswitch-agent.log I see a lot of timeout messages.

  RuntimeError: Switch connection timeout

This timeout sometimes prevents neutron-openvswitch-agent from coming UP.
We are running Pike and we have ~1000 ports in Open vSwitch.

I'm able to run ovs-vsctl, ovs-ofctl, etc. commands, which means that
Open vSwitch (vswitchd+db) is working fine.
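One way to cross-check this from the OVS side (a hypothetical diagnostic sketch, not something from the original report) is to ask ovsdb whether each bridge currently has a live connection to the agent's OpenFlow controller at tcp:127.0.0.1:6633; `is_connected=false` would match the rconn timeouts in ovs-vswitchd.log below:

```shell
# Diagnostic sketch (assumes ovs-vsctl is installed and OVS is running).
# For each Controller record, show the target, whether the bridge is
# currently connected, and the last connection status/error.
if command -v ovs-vsctl >/dev/null 2>&1; then
    ovs-vsctl --timeout=5 --columns=target,is_connected,status list Controller
else
    echo "ovs-vsctl not installed; skipping check"
fi
```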

This is the full TRACE from the neutron-openvswitch-agent log:

2018-05-30 19:22:42.353 7 WARNING ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: receive error: Connection reset by peer
2018-05-30 19:22:42.358 7 WARNING ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connection dropped (Connection reset by peer)
2018-05-30 19:24:17.626 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.ofswitch [req-3c335d47-9b3e-4f18-994b-afca7d7d70be - - - - -] Switch connection timeout: RuntimeError: Switch connection timeout
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [req-3c335d47-9b3e-4f18-994b-afca7d7d70be - - - - -] Error while processing VIF ports: RuntimeError: Switch connection timeout
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent Traceback (most recent call last):
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/ovs_neutron_agent.py", line 2066, in rpc_loop
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     ofport_changed_ports = self.update_stale_ofport_rules()
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/osprofiler/profiler.py", line 153, in wrapper
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     return f(*args, **kwargs)
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/ovs_neutron_agent.py", line 1210, in update_stale_ofport_rules
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     self.int_br.delete_arp_spoofing_protection(port=ofport)
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/openflow/native/br_int.py", line 255, in delete_arp_spoofing_protection
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     match=match)
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/openflow/native/ofswitch.py", line 111, in uninstall_flows
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     (dp, ofp, ofpp) = self._get_dp()
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/openflow/native/ovs_bridge.py", line 67, in _get_dp
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     self._cached_dpid = new_dpid
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     self.force_reraise()
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     six.reraise(self.type_, self.value, self.tb)
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/openflow/native/ovs_bridge.py", line 50, in _get_dp
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     dp = self._get_dp_by_dpid(self._cached_dpid)
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent   File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/openflow/native/ofswitch.py", line 69, in _get_dp_by_dpid
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent     raise RuntimeError(m)
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent RuntimeError: Switch connection timeout
2018-05-30 19:24:17.628 7 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent

Open vSwitch logs:

==> /var/log/kolla/openvswitch/ovs-vswitchd.log <==
2018-05-30T19:25:47.008Z|294667|rconn|INFO|br-tun<->tcp:127.0.0.1:6633: waiting 4 seconds before reconnect
2018-05-30T19:25:50.008Z|294668|rconn|INFO|br-ex<->tcp:127.0.0.1:6633: connecting...
2018-05-30T19:25:50.008Z|294669|rconn|INFO|br-int<->tcp:127.0.0.1:6633: connecting...
2018-05-30T19:25:51.008Z|294670|rconn|INFO|br-tun<->tcp:127.0.0.1:6633: connecting...
2018-05-30T19:25:54.008Z|294671|rconn|INFO|br-ex<->tcp:127.0.0.1:6633: connection timed out
2018-05-30T19:25:54.008Z|294672|rconn|INFO|br-ex<->tcp:127.0.0.1:6633: continuing to retry connections in the background but suppressing further logging
2018-05-30T19:25:54.008Z|294673|rconn|INFO|br-int<->tcp:127.0.0.1:6633: connection timed out
2018-05-30T19:25:54.008Z|294674|rconn|INFO|br-int<->tcp:127.0.0.1:6633: continuing to retry connections in the background but suppressing further logging
2018-05-30T19:25:55.008Z|294675|rconn|INFO|br-tun<->tcp:127.0.0.1:6633: connection timed out
2018-05-30T19:25:55.008Z|294676|rconn|INFO|br-tun<->tcp:127.0.0.1:6633: continuing to retry connections in the background but suppressing further logging

==> /var/log/kolla/openvswitch/ovsdb-server.log <==
2018-05-30T19:23:29.188Z|28929|reconnect|ERR|tcp:127.0.0.1:45468: no response to inactivity probe after 5 seconds, disconnecting
2018-05-30T19:23:57.341Z|28930|reconnect|ERR|tcp:127.0.0.1:54766: no response to inactivity probe after 5 seconds, disconnecting
2018-05-30T19:23:58.157Z|28931|reconnect|ERR|tcp:127.0.0.1:55294: no response to inactivity probe after 5 seconds, disconnecting
2018-05-30T19:24:27.132Z|28932|reconnect|ERR|tcp:127.0.0.1:37484: no response to inactivity probe after 5 seconds, disconnecting
2018-05-30T19:24:27.655Z|28933|reconnect|ERR|tcp:127.0.0.1:37614: no response to inactivity probe after 5 seconds, disconnecting
2018-05-30T19:24:56.926Z|28934|reconnect|ERR|tcp:127.0.0.1:47597: no response to inactivity probe after 5 seconds, disconnecting
2018-05-30T19:25:11.972Z|28935|reconnect|ERR|tcp:127.0.0.1:47582: no response to inactivity probe after 5 seconds, disconnecting
2018-05-30T19:25:19.962Z|28936|reconnect|ERR|tcp:127.0.0.1:54950: no response to inactivity probe after 5 seconds, disconnecting
2018-05-30T19:25:41.338Z|28937|reconnect|ERR|tcp:127.0.0.1:36394: no response to inactivity probe after 5 seconds, disconnecting
2018-05-30T19:25:49.850Z|28938|reconnect|ERR|tcp:127.0.0.1:39808: no response to inactivity probe after 5 seconds, disconnecting
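For completeness, the mitigation usually discussed for this class of failure (a hedged sketch, not a fix confirmed in this report) is to raise the probe and connection timeouts so a busy vswitchd or agent is not dropped after the default 5 seconds. The 30000 ms value and the bridge list below are illustrative assumptions taken from the logs above:

```shell
# Mitigation sketch (assumes root, a running OVS, and the bridge names seen
# in the logs). Raises the OpenFlow controller inactivity probe from its
# 5 s default; 30000 ms is an illustrative value, not a tested one.
if command -v ovs-vsctl >/dev/null 2>&1; then
    for br in br-int br-tun br-ex; do
        ovs-vsctl --timeout=10 set controller "$br" inactivity_probe=30000
    done
fi
# On the Neutron side, of_connect_timeout and of_request_timeout in the
# [OVS] section of ml2_conf.ini (defaults 30 s and 10 s in Pike) control
# how long the agent waits for the switch; raising them can also help on
# heavily loaded nodes.
```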

** Affects: neutron
     Importance: Undecided
         Status: New

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to neutron.
https://bugs.launchpad.net/bugs/1774257

Title:
  neutron-openvswitch-agent RuntimeError: Switch connection timeout

Status in neutron:
  New

To manage notifications about this bug go to:
https://bugs.launchpad.net/neutron/+bug/1774257/+subscriptions
