The OpenFlow connection is sometimes lost, and the subsequent OpenFlow command then fails. Retrying should recover such cases.
example:
http://logs.openstack.org/98/436798/25/check/gate-tempest-dsvm-neutron-full-centos-7-nv/3d5e54b/logs/screen-neutron-agent.txt.gz#_2017-03-13_06_54_30_891
2017-03-13 06:54:30.896 16978 ERROR OfctlService [-] unknown dpid 143366513125697
2017-03-13 06:54:30.897 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.ofswitch [req-a1036712-2598-48a8-8a38-8e7eacc0ed8f None None] ofctl request version=None,msg_type=None,msg_len=None,xid=None,OFPFlowStatsRequest(cookie=0,cookie_mask=0,flags=0,match=OFPMatch(oxm_fields={}),out_group=4294967295,out_port=4294967295,table_id=23,type=1) error Datapath Invalid 143366513125697
2017-03-13 06:54:30.898 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int [req-a1036712-2598-48a8-8a38-8e7eacc0ed8f None None] Failed to communicate with the switch
2017-03-13 06:54:30.898 16978 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int Traceback (most recent call last):
2017-03-13 06:54:30.898 16978 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int   File "/opt/stack/new/neutron/neutron/plugins/ml2/drivers/openvswitch/agent/openflow/native/br_int.py", line 52, in check_canary_table
2017-03-13 06:54:30.898 16978 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int     flows = self.dump_flows(constants.CANARY_TABLE)
2017-03-13 06:54:30.898 16978 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int   File "/opt/stack/new/neutron/neutron/plugins/ml2/drivers/openvswitch/agent/openflow/native/ofswitch.py", line 131, in dump_flows
2017-03-13 06:54:30.898 16978 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int     reply_multi=True)
2017-03-13 06:54:30.898 16978 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int   File "/opt/stack/new/neutron/neutron/plugins/ml2/drivers/openvswitch/agent/openflow/native/ofswitch.py", line 79, in _send_msg
2017-03-13 06:54:30.898 16978 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int     raise RuntimeError(m)
2017-03-13 06:54:30.898 16978 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int RuntimeError: ofctl request version=None,msg_type=None,msg_len=None,xid=None,OFPFlowStatsRequest(cookie=0,cookie_mask=0,flags=0,match=OFPMatch(oxm_fields={}),out_group=4294967295,out_port=4294967295,table_id=23,type=1) error Datapath Invalid 143366513125697
2017-03-13 06:54:30.898 16978 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int
2017-03-13 06:54:30.907 WARNING neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [req-a1036712-2598-48a8-8a38-8e7eacc0ed8f None None] OVS is dead. OVSNeutronAgent will keep running and checking OVS status periodically.
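A minimal sketch of the proposed fix: wrap the ofctl send in a bounded retry loop so a transient "Datapath Invalid" failure does not immediately surface as a RuntimeError. This is not the actual neutron patch; send_ofctl_request(), DatapathInvalid, OFCTL_RETRIES, and RETRY_DELAY are hypothetical stand-ins for neutron's real internals.

    import time

    OFCTL_RETRIES = 3  # assumed retry budget
    RETRY_DELAY = 1    # assumed delay (seconds) between attempts


    class DatapathInvalid(RuntimeError):
        """Stand-in for the 'Datapath Invalid' failure seen in the log."""


    def send_ofctl_request(request):
        """Placeholder for the real ofctl send; raises when the
        OpenFlow connection to the datapath has been lost."""
        raise DatapathInvalid("Datapath Invalid")


    def send_msg_with_retry(request):
        # Retry a few times before giving up, since the OpenFlow
        # connection may be re-established on its own.
        for attempt in range(1, OFCTL_RETRIES + 1):
            try:
                return send_ofctl_request(request)
            except DatapathInvalid:
                if attempt == OFCTL_RETRIES:
                    raise  # exhausted the retry budget
                time.sleep(RETRY_DELAY)

With something like this in place, a one-off connection drop during check_canary_table would be absorbed by a retry instead of flipping the agent into the "OVS is dead" state.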