Hello. The earlier neutron repair timeout issue has been solved by updating the metadata port manually before the repair. However, after repairing neutron, the network agents are still not stable (we frequently get a 504 gateway timeout):

$ openstack network agent list
HttpException: 504: Server Error for url: https://xxxx/v2.0/agents, The server didn't respond in time.: 504 Gateway Time-out

This traceback shows up in neutron-server.log:

2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource [req-84aa9a81-2eb9-45b4-9353-7f395134041d 3fe50ccef00f49e3b1b0bbd58705a930 c7d2001e7a2c4c32b9f2a3657f29b6b0 - default default] index failed: No details.: ovsdbapp.exceptions.TimeoutException: Commands [] exceeded timeout 180 seconds
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource Traceback (most recent call last):
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 153, in queue_txn
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     self.txns.put(txn, timeout=self.timeout)
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 51, in put
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     super(TransactionQueue, self).put(*args, **kwargs)
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/eventlet/queue.py", line 264, in put
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     result = waiter.wait()
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/eventlet/queue.py", line 141, in wait
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     return get_hub().switch()
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/eventlet/hubs/hub.py", line 298, in switch
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     return self.greenlet.switch()
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource queue.Full
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource During handling of the above exception, another exception occurred:
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource Traceback (most recent call last):
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/neutron/api/v2/resource.py", line 98, in resource
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     result = method(request=request, **args)
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/neutron_lib/db/api.py", line 139, in wrapped
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     setattr(e, '_RETRY_EXCEEDED', True)
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     self.force_reraise()
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     six.reraise(self.type_, self.value, self.tb)
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/usr/local/lib/python3.6/dist-packages/six.py", line 703, in reraise
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     raise value
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/neutron_lib/db/api.py", line 135, in wrapped
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     return f(*args, **kwargs)
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/oslo_db/api.py", line 154, in wrapper
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     ectxt.value = e.inner_exc
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     self.force_reraise()
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     six.reraise(self.type_, self.value, self.tb)
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/usr/local/lib/python3.6/dist-packages/six.py", line 703, in reraise
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     raise value
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/oslo_db/api.py", line 142, in wrapper
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     return f(*args, **kwargs)
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/neutron_lib/db/api.py", line 183, in wrapped
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     LOG.debug("Retry wrapper got retriable exception: %s", e)
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     self.force_reraise()
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 53, in commit
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     self.ovsdb_connection.queue_txn(self)
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource   File "/var/lib/kolla/venv/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 156, in queue_txn
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource     timeout=self.timeout)
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource ovsdbapp.exceptions.TimeoutException: Commands [] exceeded timeout 180 seconds
2023-07-18 08:19:01.892 670 ERROR neutron.api.v2.resource
2023-07-18 08:19:01.894 670 INFO neutron.wsgi [req-84aa9a81-2eb9-45b4-9353-7f395134041d 3fe50ccef00f49e3b1b0bbd58705a930 c7d2001e7a2c4c32b9f2a3657f29b6b0 - default default] Traceback (most recent call last):
  File "/var/lib/kolla/venv/lib/python3.6/site-packages/eventlet/wsgi.py", line 604, in handle_one_response
    write(b''.join(towrite))
  File "/var/lib/kolla/venv/lib/python3.6/site-packages/eventlet/wsgi.py", line 538, in write
    wfile.flush()
  File "/usr/lib/python3.6/socket.py", line 604, in write
    return self._sock.send(b)
  File "/var/lib/kolla/venv/lib/python3.6/site-packages/eventlet/greenio/base.py", line 396, in send
    return self._send_loop(self.fd.send, data, flags)
  File "/var/lib/kolla/venv/lib/python3.6/site-packages/eventlet/greenio/base.py", line 383, in _send_loop
    return send_method(data, *args)
ConnectionResetError: [Errno 104] Connection reset by peer

These are the ovn-nb and ovn-sb logs:

root@xxx:~# tail -f /var/log/kolla/openvswitch/ovn-nb-db.log
2023-07-18T01:21:04.328Z|07042|reconnect|ERR|tcp:xxxx:49146: no response to inactivity probe after 5.29 seconds, disconnecting
2023-07-18T01:21:06.997Z|07043|reconnect|ERR|tcp:xxxx:49148: no response to inactivity probe after 5 seconds, disconnecting
2023-07-18T01:21:16.405Z|07044|reconnect|ERR|tcp:xxxx:32964: no response to inactivity probe after 5.08 seconds, disconnecting
2023-07-18T01:21:18.071Z|07045|reconnect|ERR|tcp:xxxx:55250: no response to inactivity probe after 5.08 seconds, disconnecting
2023-07-18T01:21:24.407Z|07046|reconnect|ERR|tcp:xxxx:55278: no response to inactivity probe after 5 seconds, disconnecting
2023-07-18T01:21:26.740Z|07047|reconnect|ERR|tcp:xxxx:55292: no response to inactivity probe after 5 seconds, disconnecting
2023-07-18T01:21:37.151Z|07048|reconnect|ERR|tcp:xxxx:41624: no response to inactivity probe after 5.08 seconds, disconnecting

root@xxxx:~# tail -f /var/log/kolla/openvswitch/ovn-sb-db.log
2023-07-18T01:18:09.080Z|01544|reconnect|ERR|tcp:xxxx:48630: no response to inactivity probe after 5 seconds, disconnecting
2023-07-18T01:19:04.192Z|01545|reconnect|ERR|tcp:xxxx:44082: no response to inactivity probe after 5.29 seconds, disconnecting
2023-07-18T01:19:11.866Z|01546|reconnect|ERR|tcp:xxxx:44104: no response to inactivity probe after 5 seconds, disconnecting
2023-07-18T01:19:31.545Z|01547|reconnect|ERR|tcp:xxxx:55158: no response to inactivity probe after 5 seconds, disconnecting
2023-07-18T01:20:04.567Z|01548|reconnect|ERR|tcp:xxxx:57464: no response to inactivity probe after 5 seconds, disconnecting

I have already set the probe intervals, but the inactivity probe messages still show up:

ovs-vsctl set open . external_ids:ovn-remote-probe-interval="180000"
docker exec ovn_nb_db ovn-nbctl set NB_Global . options:northd_probe_interval=180000

Are the neutron errors and the OVN inactivity probe errors related? Is this what causes the network agents to time out so frequently?
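For what it's worth, the two settings above may not be the ones these log lines are about: ovn-remote-probe-interval tunes the probe that ovn-controller (the client) sends toward the SB DB, and northd_probe_interval tunes ovn-northd, while the "no response to inactivity probe ... disconnecting" messages are emitted by the ovn-nb-db/ovn-sb-db servers themselves, whose probe is configured in each database's Connection table. The 180-second figure in the neutron traceback also appears to match the default of neutron's [ovn] ovsdb_connection_timeout option, so the OVSDB disconnects and the 504s could plausibly be related. A minimal sketch of raising the server-side probe, assuming the kolla-ansible default container names ovn_nb_db and ovn_sb_db (adjust to your deployment; this is a config fragment to run on the OVN DB hosts, not a verified fix):

# Inspect the current server-side probe (inactivity_probe is in milliseconds;
# an empty value means the built-in 5000 ms default, matching the ~5 s in the logs).
docker exec ovn_nb_db ovn-nbctl list connection
docker exec ovn_sb_db ovn-sbctl list connection

# Raise the server-side inactivity probe on both databases, e.g. to 180 s.
docker exec ovn_nb_db ovn-nbctl set connection . inactivity_probe=180000
docker exec ovn_sb_db ovn-sbctl set connection . inactivity_probe=180000

Note that a longer probe only hides the symptom; if the DB servers are genuinely too loaded to answer within 5 seconds, the underlying load or clustering problem still needs to be found.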