Here are the OpenStack and OVN versions. I tried to launch an instance on a VLAN network and got the error below:

OpenStack version: Victoria
ovn-controller: 20.03.1
Open vSwitch Library: 2.13.1
OpenFlow versions: 0x4:0x4
kolla-ansible: 11.0.0

I also tried to create a Geneve network and got the same error:

2023-09-28 09:26:39.294 742 INFO ovsdbapp.backend.ovs_idl.vlog [req-27994a40-4c29-4f4f-aeaf-05c9a7e72f46 - - - - -] tcp:100.127.234.13:6642: connection closed by client
2023-09-28 09:26:39.295 742 INFO ovsdbapp.backend.ovs_idl.vlog [req-27994a40-4c29-4f4f-aeaf-05c9a7e72f46 - - - - -] tcp:100.127.234.13:6642: continuing to reconnect in the background but suppressing further logging
2023-09-28 09:26:39.295 737 INFO ovsdbapp.backend.ovs_idl.vlog [req-df52297b-d7ca-4dc2-8fe2-b57ff67b6e9e - - - - -] tcp:100.127.234.13:6642: clustered database server is not cluster leader; trying another server
2023-09-28 09:26:39.295 739 INFO ovsdbapp.backend.ovs_idl.vlog [req-b1aa0028-1d36-45a1-b84c-89a0fced1bb5 - - - - -] tcp:100.127.234.13:6642: connection closed by client
2023-09-28 09:26:39.295 739 INFO ovsdbapp.backend.ovs_idl.vlog [req-b1aa0028-1d36-45a1-b84c-89a0fced1bb5 - - - - -] tcp:100.127.234.13:6642: continuing to reconnect in the background but suppressing further logging
2023-09-28 09:26:39.296 737 INFO ovsdbapp.backend.ovs_idl.vlog [req-df52297b-d7ca-4dc2-8fe2-b57ff67b6e9e - - - - -] tcp:100.127.234.13:6642: connection closed by client
2023-09-28 09:26:39.297 737 INFO ovsdbapp.backend.ovs_idl.vlog [req-df52297b-d7ca-4dc2-8fe2-b57ff67b6e9e - - - - -] tcp:100.127.234.13:6642: continuing to reconnect in the background but suppressing further logging
2023-09-28 09:26:39.306 738 INFO ovsdbapp.backend.ovs_idl.vlog [req-9cf45967-c9c0-43c4-aa5a-94c6e05d5b41 - - - - -] tcp:100.127.234.12:6641: connected
2023-09-28 09:26:39.354 737 INFO ovsdbapp.backend.ovs_idl.vlog [req-a21f58d5-0575-4b19-a5ce-619954040e8b - - - - -] tcp:100.127.234.12:6641: connected
2023-09-28 09:26:39.398 723 INFO ovsdbapp.backend.ovs_idl.vlog [req-e9961355-f1bc-4147-8bbe-268c7623d05e - - - - -] tcp:100.127.234.12:6641: connected
2023-09-28 09:26:39.459 722 INFO ovsdbapp.backend.ovs_idl.vlog [req-7e70a1de-9757-4966-b887-637d3c289577 - - - - -] tcp:100.127.234.12:6642: connecting...
2023-09-28 09:26:39.460 722 INFO ovsdbapp.backend.ovs_idl.vlog [req-7e70a1de-9757-4966-b887-637d3c289577 - - - - -] tcp:100.127.234.12:6642: connected
2023-09-28 09:26:39.676 747 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:100.127.234.12:6641: connected
2023-09-28 09:26:39.724 747 ERROR ovsdbapp.backend.ovs_idl.transaction [-] OVSDB Error: The transaction failed because the IDL has been configured to require a database lock but didn't get it yet or has already lost it
2023-09-28 09:26:39.725 747 ERROR ovsdbapp.backend.ovs_idl.transaction [req-ea36722f-0be8-4b0a-af2e-7fb336538969 - - - - -] Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/ovsdbapp/backend/ovs_idl/connection.py", line 122, in run
    txn.results.put(txn.do_commit())
  File "/usr/lib/python3/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 119, in do_commit
    raise RuntimeError(msg)
RuntimeError: OVSDB Error: The transaction failed because the IDL has been configured to require a database lock but didn't get it yet or has already lost it
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance [req-ea36722f-0be8-4b0a-af2e-7fb336538969 - - - - -] Maintenance task: Failed to fix resource a4c9cf80-3143-42bb-b90a-8c0df2526b7b (type: ports): RuntimeError: OVSDB Error: The transaction failed because the IDL has been configured to require a database lock but didn't get it yet or has already lost it
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance Traceback (most recent call last):
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance   File "/usr/lib/python3/dist-packages/neutron/plugins/ml2/drivers/ovn/mech_driver/ovsdb/maintenance.py", line 370, in check_for_inconsistencies
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance     self._fix_create_update(admin_context, row)
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance   File "/usr/lib/python3/dist-packages/neutron/plugins/ml2/drivers/ovn/mech_driver/ovsdb/maintenance.py", line 229, in _fix_create_update
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance     res_map['ovn_create'](context, n_obj)
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance   File "/usr/lib/python3/dist-packages/neutron/plugins/ml2/drivers/ovn/mech_driver/ovsdb/ovn_client.py", line 413, in create_port
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance     self._qos_driver.create_port(txn, port)
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance   File "/usr/lib/python3.8/contextlib.py", line 120, in __exit__
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance     next(self.gen)
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance   File "/usr/lib/python3/dist-packages/neutron/plugins/ml2/drivers/ovn/mech_driver/ovsdb/impl_idl_ovn.py", line 185, in transaction
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance     yield t
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance   File "/usr/lib/python3.8/contextlib.py", line 120, in __exit__
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance     next(self.gen)
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance   File "/usr/lib/python3/dist-packages/ovsdbapp/api.py", line 110, in transaction
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance     del self._nested_txns_map[cur_thread_id]
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance   File "/usr/lib/python3/dist-packages/ovsdbapp/api.py", line 61, in __exit__
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance     self.result = self.commit()
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance   File "/usr/lib/python3/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 63, in commit
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance     raise result.ex
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance   File "/usr/lib/python3/dist-packages/ovsdbapp/backend/ovs_idl/connection.py", line 122, in run
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance     txn.results.put(txn.do_commit())
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance   File "/usr/lib/python3/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 119, in do_commit
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance     raise RuntimeError(msg)
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance RuntimeError: OVSDB Error: The transaction failed because the IDL has been configured to require a database lock but didn't get it yet or has already lost it
2023-09-28 09:26:39.725 747 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance
2023-09-28 09:26:39.762 740 INFO ovsdbapp.backend.ovs_idl.vlog [req-f2e190a4-1734-4edf-ad87-bf48a4e0e49f - - - - -] tcp:100.127.234.13:6642: connection closed by peer
2023-09-28 09:26:39.807 747 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:100.127.234.13:6642: clustered database server is not cluster leader; trying another server
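For context, the "clustered database server is not cluster leader" and database-lock errors suggest the Neutron workers are hitting a non-leader member of the clustered OVN database during a leader change; the OVN maintenance worker requires the lock held against the cluster leader. One thing worth double-checking is that the ML2/OVN connection strings list every member of the NB/SB clusters so the client can fail over to the current leader. A sketch of the relevant `[ovn]` section in ml2_conf.ini, assuming a three-node cluster (the .11 address is hypothetical; only .12 and .13 appear in the logs above):

```ini
[ovn]
# List all cluster members, comma-separated, so the IDL can
# retry against another server when the leader changes.
# (100.127.234.11 is an assumed third member for illustration.)
ovn_nb_connection = tcp:100.127.234.11:6641,tcp:100.127.234.12:6641,tcp:100.127.234.13:6641
ovn_sb_connection = tcp:100.127.234.11:6642,tcp:100.127.234.12:6642,tcp:100.127.234.13:6642
```

If only one or two members are listed, a worker pinned to a non-leader node can lose the lock exactly as the traceback shows.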