2020-09-15 18:49:04 |
Terry Wilson |
bug |
|
|
added bug |
2020-09-28 19:27:21 |
OpenStack Infra |
ovsdbapp: status |
In Progress |
Fix Released |
|
2020-10-27 16:04:12 |
OpenStack Infra |
tags |
|
in-stable-victoria |
|
2020-10-30 11:45:50 |
OpenStack Infra |
tags |
in-stable-victoria |
in-stable-train in-stable-victoria |
|
2020-10-30 14:52:11 |
OpenStack Infra |
tags |
in-stable-train in-stable-victoria |
in-stable-train in-stable-ussuri in-stable-victoria |
|
2021-04-08 06:52:02 |
Hemanth Nakkina |
bug task added |
|
python-ovsdbapp (Ubuntu) |
|
2021-04-08 06:52:23 |
Hemanth Nakkina |
nominated for series |
|
Ubuntu Hirsute |
|
2021-04-08 06:52:23 |
Hemanth Nakkina |
bug task added |
|
python-ovsdbapp (Ubuntu Hirsute) |
|
2021-04-08 06:52:23 |
Hemanth Nakkina |
nominated for series |
|
Ubuntu Groovy |
|
2021-04-08 06:52:23 |
Hemanth Nakkina |
bug task added |
|
python-ovsdbapp (Ubuntu Groovy) |
|
2021-04-08 06:52:23 |
Hemanth Nakkina |
nominated for series |
|
Ubuntu Focal |
|
2021-04-08 06:52:23 |
Hemanth Nakkina |
bug task added |
|
python-ovsdbapp (Ubuntu Focal) |
|
2021-04-08 06:53:15 |
Hemanth Nakkina |
description |
If ovsdb-server is down for a while and we are connecting via SSL, python-ovs will raise
OpenSSL.SSL.SysCallError: (111, 'ECONNREFUSED')
instead of just returning an error type. If this goes on for a bit, then the Connection thread will exit and be unrecoverable without restarting neutron-server. |
If ovsdb-server is down for a while and we are connecting via SSL, python-ovs will raise
OpenSSL.SSL.SysCallError: (111, 'ECONNREFUSED')
instead of just returning an error type. If this goes on for a bit, then the Connection thread will exit and be unrecoverable without restarting neutron-server.
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
SRU:
[Impact]
Any intermittent connection issue between neutron-server and the ovsdb nb/sb databases resulted in neutron-server no longer handling ovsdb transactions, due to improper exception handling during reconnections. This further causes failures in post-commit updates of resources and leads to neutron/ovn db inconsistencies.
This fix catches the exceptions and retries connecting to ovsdb.
[Test plan]
* Deploy bionic-ussuri with neutron-server and ovn-central in HA using juju charms.
* Launch a few instances and check that they reach the active state.
* Simulate network communication issues by adding iptables rules for ports 6641, 6643, 6644 and 16642:
- On ovn-central/0, reject packets from ovn-central/2 and neutron-server/2
- On ovn-central/1, reject packets from ovn-central/2 and neutron-server/2
- On ovn-central/2, reject packets from ovn-central/0, ovn-central/1, neutron-server/0 and neutron-server/1
DROP_PKTS_FROM_OVN_CENTRAL=
DROP_PKTS_FROM_NEUTRON_SERVER=
for ip in $DROP_PKTS_FROM_OVN_CENTRAL; do for port in 6641 6643 6644 16642; do iptables -I ufw-before-input 1 -s $ip -p tcp --dport $port -j REJECT; done; done
for ip in $DROP_PKTS_FROM_NEUTRON_SERVER; do for port in 6641 16642; do iptables -I ufw-before-input 1 -s $ip -p tcp --dport $port -j REJECT; done; done
* After a minute, remove the newly added REJECT rules.
* Launch around 5 new VMs (5 to ensure some post-create updates land on neutron-server/2) and look for Timeout exceptions on neutron-server/2.
Timeout exceptions indicate that the neutron-server ovsdb connections are stale and no longer handling ovsdb transactions.
No Timeout exceptions, together with port status updates arriving from ovsdb, indicate that neutron-server reconnected successfully and resumed handling updates.
[Where problems could occur]
The fix passed the upstream zuul gates (tempest tests etc.) and the patch only adds reconnection retries to ovsdbapp, so no regressions are expected. |
|
2021-04-08 06:53:24 |
Hemanth Nakkina |
tags |
in-stable-train in-stable-ussuri in-stable-victoria |
in-stable-train in-stable-ussuri in-stable-victoria sts |
|
2021-04-08 06:53:52 |
Hemanth Nakkina |
python-ovsdbapp (Ubuntu Hirsute): status |
New |
Fix Released |
|
2021-04-08 07:08:18 |
Hemanth Nakkina |
attachment added |
|
Debdiff for groovy https://bugs.launchpad.net/ubuntu/+source/python-ovsdbapp/+bug/1895727/+attachment/5485479/+files/lp1895727_groovy.debdiff |
|
2021-04-08 07:08:43 |
Hemanth Nakkina |
attachment added |
|
Debdiff for focal https://bugs.launchpad.net/ubuntu/+source/python-ovsdbapp/+bug/1895727/+attachment/5485480/+files/lp1895727_focal.debdiff |
|
2021-04-08 07:11:19 |
Launchpad Janitor |
python-ovsdbapp (Ubuntu Focal): status |
New |
Confirmed |
|
2021-04-08 07:11:19 |
Launchpad Janitor |
python-ovsdbapp (Ubuntu Groovy): status |
New |
Confirmed |
|
2021-04-12 14:03:50 |
Edward Hope-Morley |
bug task added |
|
cloud-archive |
|
2021-04-12 14:04:00 |
Edward Hope-Morley |
nominated for series |
|
cloud-archive/victoria |
|
2021-04-12 14:04:00 |
Edward Hope-Morley |
bug task added |
|
cloud-archive/victoria |
|
2021-04-12 14:04:00 |
Edward Hope-Morley |
nominated for series |
|
cloud-archive/ussuri |
|
2021-04-12 14:04:00 |
Edward Hope-Morley |
bug task added |
|
cloud-archive/ussuri |
|
2021-04-12 21:24:18 |
Corey Bryant |
bug |
|
|
added subscriber Ubuntu Stable Release Updates Team |
2021-04-15 09:48:57 |
Hemanth Nakkina |
description |
If ovsdb-server is down for a while and we are connecting via SSL, python-ovs will raise
OpenSSL.SSL.SysCallError: (111, 'ECONNREFUSED')
instead of just returning an error type. If this goes on for a bit, then the Connection thread will exit and be unrecoverable without restarting neutron-server.
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
SRU:
[Impact]
Any intermittent connection issue between neutron-server and the ovsdb nb/sb databases resulted in neutron-server no longer handling ovsdb transactions, due to improper exception handling during reconnections. This further causes failures in post-commit updates of resources and leads to neutron/ovn db inconsistencies.
This fix catches the exceptions and retries connecting to ovsdb.
[Test plan]
* Deploy bionic-ussuri with neutron-server and ovn-central in HA using juju charms.
* Launch a few instances and check that they reach the active state.
* Simulate network communication issues by adding iptables rules for ports 6641, 6643, 6644 and 16642:
- On ovn-central/0, reject packets from ovn-central/2 and neutron-server/2
- On ovn-central/1, reject packets from ovn-central/2 and neutron-server/2
- On ovn-central/2, reject packets from ovn-central/0, ovn-central/1, neutron-server/0 and neutron-server/1
DROP_PKTS_FROM_OVN_CENTRAL=
DROP_PKTS_FROM_NEUTRON_SERVER=
for ip in $DROP_PKTS_FROM_OVN_CENTRAL; do for port in 6641 6643 6644 16642; do iptables -I ufw-before-input 1 -s $ip -p tcp --dport $port -j REJECT; done; done
for ip in $DROP_PKTS_FROM_NEUTRON_SERVER; do for port in 6641 16642; do iptables -I ufw-before-input 1 -s $ip -p tcp --dport $port -j REJECT; done; done
* After a minute, remove the newly added REJECT rules.
* Launch around 5 new VMs (5 to ensure some post-create updates land on neutron-server/2) and look for Timeout exceptions on neutron-server/2.
Timeout exceptions indicate that the neutron-server ovsdb connections are stale and no longer handling ovsdb transactions.
No Timeout exceptions, together with port status updates arriving from ovsdb, indicate that neutron-server reconnected successfully and resumed handling updates.
[Where problems could occur]
The fix passed the upstream zuul gates (tempest tests etc.) and the patch only adds reconnection retries to ovsdbapp, so no regressions are expected. |
If ovsdb-server is down for a while and we are connecting via SSL, python-ovs will raise
OpenSSL.SSL.SysCallError: (111, 'ECONNREFUSED')
instead of just returning an error type. If this goes on for a bit, then the Connection thread will exit and be unrecoverable without restarting neutron-server.
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
SRU:
[Impact]
Any intermittent connection issue between neutron-server and the ovsdb nb/sb databases resulted in neutron-server no longer handling ovsdb transactions, due to improper exception handling during reconnections. This further causes failures in post-commit updates of resources and leads to neutron/ovn db inconsistencies.
This fix catches the exceptions and retries connecting to ovsdb.
[Test plan]
* Deploy bionic-ussuri with neutron-server and ovn-central in HA using juju charms.
* Launch a few instances and check that they reach the active state.
* Simulate network communication issues by adding iptables rules for ports 6641, 6643, 6644 and 16642:
- On ovn-central/0, reject packets from ovn-central/2 and neutron-server/2
- On ovn-central/1, reject packets from ovn-central/2 and neutron-server/2
- On ovn-central/2, reject packets from ovn-central/0, ovn-central/1, neutron-server/0 and neutron-server/1
DROP_PKTS_FROM_OVN_CENTRAL=
DROP_PKTS_FROM_NEUTRON_SERVER=
for ip in $DROP_PKTS_FROM_OVN_CENTRAL; do for port in 6641 6643 6644 16642; do iptables -I ufw-before-input 1 -s $ip -p tcp --dport $port -j REJECT; done; done
for ip in $DROP_PKTS_FROM_NEUTRON_SERVER; do for port in 6641 16642; do iptables -I ufw-before-input 1 -s $ip -p tcp --dport $port -j REJECT; done; done
* After a minute, remove the newly added REJECT rules.
* Launch around 5 new VMs (5 to ensure some post-create updates land on neutron-server/2) and look for Timeout exceptions on neutron-server/2.
Timeout exceptions indicate that the neutron-server ovsdb connections are stale and no longer handling ovsdb transactions.
No Timeout exceptions, together with port status updates arriving from ovsdb, indicate that neutron-server reconnected successfully and resumed handling updates.
[Where problems could occur]
The fix passed the upstream zuul gates (tempest tests etc.) and the patch only adds reconnection retries to ovsdbapp. The fix retries the connection every 4 minutes (3-minute connection timeout + 1-minute sleep) until it succeeds; I do not see how this change could introduce a regression. |
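The test plan's cleanup step ("remove the newly added REJECT rules") can be sketched as below. This is an illustration, not part of the bug record: the IP is a placeholder (the real unit IPs are not recorded in the bug), and the commands are echoed rather than executed so the sketch can be run without root; drop the `echo` to apply them for real. `iptables -D` with the same rule specification as the earlier `iptables -I` deletes that exact rule.

```shell
# Sketch: undo the REJECT rules inserted during the test plan above.
# The IP below is a placeholder (assumption); substitute the real unit IPs.
# Commands are echoed, not executed, so this runs without root privileges.
DROP_PKTS_FROM_OVN_CENTRAL="192.0.2.10"   # placeholder, not from the bug record
for ip in $DROP_PKTS_FROM_OVN_CENTRAL; do
  for port in 6641 6643 6644 16642; do
    # -D with the same match as the earlier -I removes that exact rule
    echo iptables -D ufw-before-input -s "$ip" -p tcp --dport "$port" -j REJECT
  done
done
```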
|
2021-04-27 18:23:59 |
Brian Murray |
python-ovsdbapp (Ubuntu Groovy): status |
Confirmed |
Fix Committed |
|
2021-04-27 18:24:05 |
Brian Murray |
bug |
|
|
added subscriber SRU Verification |
2021-04-27 18:24:08 |
Brian Murray |
tags |
in-stable-train in-stable-ussuri in-stable-victoria sts |
in-stable-train in-stable-ussuri in-stable-victoria sts verification-needed verification-needed-groovy |
|
2021-04-27 18:25:00 |
Brian Murray |
python-ovsdbapp (Ubuntu Focal): status |
Confirmed |
Fix Committed |
|
2021-04-27 18:25:06 |
Brian Murray |
tags |
in-stable-train in-stable-ussuri in-stable-victoria sts verification-needed verification-needed-groovy |
in-stable-train in-stable-ussuri in-stable-victoria sts verification-needed verification-needed-focal verification-needed-groovy |
|
2021-04-27 19:53:02 |
Corey Bryant |
cloud-archive/ussuri: status |
New |
Fix Committed |
|
2021-04-27 19:53:04 |
Corey Bryant |
tags |
in-stable-train in-stable-ussuri in-stable-victoria sts verification-needed verification-needed-focal verification-needed-groovy |
in-stable-train in-stable-ussuri in-stable-victoria sts verification-needed verification-needed-focal verification-needed-groovy verification-ussuri-needed |
|
2021-04-27 20:27:15 |
Corey Bryant |
cloud-archive/victoria: status |
New |
Fix Committed |
|
2021-04-27 20:27:16 |
Corey Bryant |
tags |
in-stable-train in-stable-ussuri in-stable-victoria sts verification-needed verification-needed-focal verification-needed-groovy verification-ussuri-needed |
in-stable-train in-stable-ussuri in-stable-victoria sts verification-needed verification-needed-focal verification-needed-groovy verification-ussuri-needed verification-victoria-needed |
|
2021-04-27 20:28:14 |
Corey Bryant |
cloud-archive: status |
New |
Fix Released |
|
2021-05-07 07:52:23 |
Hemanth Nakkina |
tags |
in-stable-train in-stable-ussuri in-stable-victoria sts verification-needed verification-needed-focal verification-needed-groovy verification-ussuri-needed verification-victoria-needed |
in-stable-train in-stable-ussuri in-stable-victoria sts verification-done verification-done-focal verification-done-groovy verification-ussuri-done verification-victoria-done |
|
2021-05-10 08:29:21 |
Launchpad Janitor |
python-ovsdbapp (Ubuntu Groovy): status |
Fix Committed |
Fix Released |
|
2021-05-10 08:29:24 |
Łukasz Zemczak |
removed subscriber Ubuntu Stable Release Updates Team |
|
|
|
2021-05-10 08:34:55 |
Launchpad Janitor |
python-ovsdbapp (Ubuntu Focal): status |
Fix Committed |
Fix Released |
|
2021-05-10 12:13:40 |
Corey Bryant |
cloud-archive/victoria: status |
Fix Committed |
Fix Released |
|
2021-05-10 12:14:27 |
Corey Bryant |
cloud-archive/ussuri: status |
Fix Committed |
Fix Released |
|