OVSDB transaction returned TRY_AGAIN, retrying do_commit

Bug #2037500 reported by Attasit
Affects: neutron
Status: Invalid
Importance: Undecided
Assigned to: Unassigned

Bug Description

When creating an instance, attaching the port to the instance fails, and the neutron server logs the errors below.

2023-09-27 09:58:10.725 716 DEBUG ovsdbapp.backend.ovs_idl.transaction [req-2df7a23e-8b9f-409c-a35e-9b78edb6bce1 - - - - -] OVSDB transaction returned TRY_AGAIN, retrying do_commit /usr/lib/python3/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:97
2023-09-27 09:58:10.724 723 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn [req-ca61167b-aca9-46a2-81fb-8f8e3ebba349 - - - - -] OVS database connection to OVN_Northbound failed with error: 'Timeout'. Verify that the OVS and OVN services are available and that the 'ovn_nb_connection' and 'ovn_sb_connection' configuration options are correct.: Exception: Timeout
2023-09-27 09:58:10.724 723 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn Traceback (most recent call last):
2023-09-27 09:58:10.724 723 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn File "/usr/lib/python3/dist-packages/neutron/plugins/ml2/drivers/ovn/mech_driver/ovsdb/impl_idl_ovn.py", line 68, in start_connection
2023-09-27 09:58:10.724 723 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn self.ovsdb_connection.start()
2023-09-27 09:58:10.724 723 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn File "/usr/lib/python3/dist-packages/ovsdbapp/backend/ovs_idl/connection.py", line 79, in start
2023-09-27 09:58:10.724 723 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn idlutils.wait_for_change(self.idl, self.timeout)
2023-09-27 09:58:10.724 723 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn File "/usr/lib/python3/dist-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 219, in wait_for_change
2023-09-27 09:58:10.724 723 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn raise Exception("Timeout") # TODO(twilson) use TimeoutException?
2023-09-27 09:58:10.724 723 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn Exception: Timeout
2023-09-27 09:58:10.724 723 ERROR neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.impl_idl_ovn
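
For reference, the 'Timeout' above is raised while the mechanism driver waits for its first update from the OVN_Northbound database; the error message itself points at the 'ovn_nb_connection' and 'ovn_sb_connection' options. As a minimal sketch of where those options live in the neutron ML2 plugin configuration (the addresses below are placeholders, not this deployment's values; 6641/6642 are the usual NB/SB ports, matching the logs):

    [ovn]
    ovn_nb_connection = tcp:192.0.2.10:6641,tcp:192.0.2.11:6641,tcp:192.0.2.12:6641
    ovn_sb_connection = tcp:192.0.2.10:6642,tcp:192.0.2.11:6642,tcp:192.0.2.12:6642

With a clustered OVN database, listing every cluster member comma-separated lets the IDL fail over to another server when one becomes unreachable.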

Later, the following error still appears:

2023-09-27 12:07:36.849 747 ERROR ovsdbapp.backend.ovs_idl.transaction [-] OVSDB Error: The transaction failed because the IDL has been configured to require a database lock but didn't get it yet or has already lost it
2023-09-27 12:07:36.849 747 ERROR ovsdbapp.backend.ovs_idl.transaction [req-7f9163da-8faf-4509-b650-aedfdf4ff303 - - - - -] Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/ovsdbapp/backend/ovs_idl/connection.py", line 122, in run
    txn.results.put(txn.do_commit())
  File "/usr/lib/python3/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 119, in do_commit
    raise RuntimeError(msg)
RuntimeError: OVSDB Error: The transaction failed because the IDL has been configured to require a database lock but didn't get it yet or has already lost it
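
The "database lock" failure above is not a malformed transaction: ovsdbapp refuses to commit while the IDL requires an OVSDB lock that the server has not granted yet or has revoked (for example after a reconnect). Neutron's OVN maintenance worker takes such a lock so that only one neutron-server instance runs the periodic tasks. A minimal sketch of the underlying mechanism, assuming an OVSDB server at the placeholder address and using an illustrative lock name:

    # Sketch only: shows an IDL-level lock, not neutron's actual wiring.
    from ovsdbapp.backend.ovs_idl import connection

    # from_server() fetches the schema from the server and registers all tables.
    i = connection.OvsdbIdl.from_server('tcp:192.0.2.10:6641', 'OVN_Northbound')
    i.set_lock('example_maintenance_lock')  # hypothetical lock name
    # Until the server grants the lock (i.has_lock becomes True), every
    # commit attempted through this IDL fails with the error shown above.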

2023-09-27 12:07:36.849 747 ERROR futurist.periodics [req-7f9163da-8faf-4509-b650-aedfdf4ff303 - - - - -] Failed to call periodic 'neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance.DBInconsistenciesPeriodics.check_for_ha_chassis_group_address' (it runs every 600.00 seconds): RuntimeError: OVSDB Error: The transaction failed because the IDL has been configured to require a database lock but didn't get it yet or has already lost it
2023-09-27 12:07:36.849 747 ERROR futurist.periodics Traceback (most recent call last):
2023-09-27 12:07:36.849 747 ERROR futurist.periodics File "/usr/lib/python3/dist-packages/futurist/periodics.py", line 293, in run
2023-09-27 12:07:36.849 747 ERROR futurist.periodics work()
2023-09-27 12:07:36.849 747 ERROR futurist.periodics File "/usr/lib/python3/dist-packages/futurist/periodics.py", line 67, in __call__
2023-09-27 12:07:36.849 747 ERROR futurist.periodics return self.callback(*self.args, **self.kwargs)
2023-09-27 12:07:36.849 747 ERROR futurist.periodics File "/usr/lib/python3/dist-packages/futurist/periodics.py", line 181, in decorator
2023-09-27 12:07:36.849 747 ERROR futurist.periodics return f(*args, **kwargs)
2023-09-27 12:07:36.849 747 ERROR futurist.periodics File "/usr/lib/python3/dist-packages/neutron/plugins/ml2/drivers/ovn/mech_driver/ovsdb/maintenance.py", line 622, in check_for_ha_chassis_group_address
2023-09-27 12:07:36.849 747 ERROR futurist.periodics priority -= 1
2023-09-27 12:07:36.849 747 ERROR futurist.periodics File "/usr/lib/python3.8/contextlib.py", line 120, in __exit__
2023-09-27 12:07:36.849 747 ERROR futurist.periodics next(self.gen)
2023-09-27 12:07:36.849 747 ERROR futurist.periodics File "/usr/lib/python3/dist-packages/neutron/plugins/ml2/drivers/ovn/mech_driver/ovsdb/impl_idl_ovn.py", line 185, in transaction
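
For context on the truncated traceback above: the failing task is a futurist periodic, which is why the log notes "it runs every 600.00 seconds". A minimal sketch of that pattern (the task body and wiring are illustrative, not neutron's actual code):

    from futurist import periodics

    @periodics.periodic(spacing=600)  # run every 600 seconds
    def check_something():
        # Illustrative body; the real task audits HA chassis group addresses.
        pass

    # PeriodicWorker takes (callable, args, kwargs) tuples and schedules them.
    worker = periodics.PeriodicWorker([(check_something, (), {})])
    worker.start()  # blocks; neutron drives this from a background thread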

Tags: ovn
Lajos Katona (lajos-katona) wrote (last edit):

Thanks for reporting this issue. Could you give some help with reproduction:
* OpenStack version, and perhaps also the OVN version (from the logs it seems like Zed, am I right?)
* Steps for reproduction: what kind of network did you use, what kind of port, etc.

Changed in neutron:
status: New → Incomplete
Attasit (attasies) wrote:

Here are the OpenStack and OVN versions. I tried to launch an instance on a VLAN network and got this error.
OpenStack version: Victoria
ovn-controller 20.03.1
Open vSwitch Library 2.13.1
OpenFlow versions 0x4:0x4
kolla-ansible: 11.0.0

I also tried to create a Geneve network and got the error below as well.
2023-09-28 09:26:39.294 742 INFO ovsdbapp.backend.ovs_idl.vlog [req-27994a40-4c29-4f4f-aeaf-05c9a7e72f46 - - - - -] tcp:100.127.234.13:6642: connection closed by client
2023-09-28 09:26:39.295 742 INFO ovsdbapp.backend.ovs_idl.vlog [req-27994a40-4c29-4f4f-aeaf-05c9a7e72f46 - - - - -] tcp:100.127.234.13:6642: continuing to reconnect in the background but suppressing further logging
2023-09-28 09:26:39.295 737 INFO ovsdbapp.backend.ovs_idl.vlog [req-df52297b-d7ca-4dc2-8fe2-b57ff67b6e9e - - - - -] tcp:100.127.234.13:6642: clustered database server is not cluster leader; trying another server
2023-09-28 09:26:39.295 739 INFO ovsdbapp.backend.ovs_idl.vlog [req-b1aa0028-1d36-45a1-b84c-89a0fced1bb5 - - - - -] tcp:100.127.234.13:6642: connection closed by client
2023-09-28 09:26:39.295 739 INFO ovsdbapp.backend.ovs_idl.vlog [req-b1aa0028-1d36-45a1-b84c-89a0fced1bb5 - - - - -] tcp:100.127.234.13:6642: continuing to reconnect in the background but suppressing further logging
2023-09-28 09:26:39.296 737 INFO ovsdbapp.backend.ovs_idl.vlog [req-df52297b-d7ca-4dc2-8fe2-b57ff67b6e9e - - - - -] tcp:100.127.234.13:6642: connection closed by client
2023-09-28 09:26:39.297 737 INFO ovsdbapp.backend.ovs_idl.vlog [req-df52297b-d7ca-4dc2-8fe2-b57ff67b6e9e - - - - -] tcp:100.127.234.13:6642: continuing to reconnect in the background but suppressing further logging
2023-09-28 09:26:39.306 738 INFO ovsdbapp.backend.ovs_idl.vlog [req-9cf45967-c9c0-43c4-aa5a-94c6e05d5b41 - - - - -] tcp:100.127.234.12:6641: connected
2023-09-28 09:26:39.354 737 INFO ovsdbapp.backend.ovs_idl.vlog [req-a21f58d5-0575-4b19-a5ce-619954040e8b - - - - -] tcp:100.127.234.12:6641: connected
2023-09-28 09:26:39.398 723 INFO ovsdbapp.backend.ovs_idl.vlog [req-e9961355-f1bc-4147-8bbe-268c7623d05e - - - - -] tcp:100.127.234.12:6641: connected
2023-09-28 09:26:39.459 722 INFO ovsdbapp.backend.ovs_idl.vlog [req-7e70a1de-9757-4966-b887-637d3c289577 - - - - -] tcp:100.127.234.12:6642: connecting...
2023-09-28 09:26:39.460 722 INFO ovsdbapp.backend.ovs_idl.vlog [req-7e70a1de-9757-4966-b887-637d3c289577 - - - - -] tcp:100.127.234.12:6642: connected
2023-09-28 09:26:39.676 747 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:100.127.234.12:6641: connected
2023-09-28 09:26:39.724 747 ERROR ovsdbapp.backend.ovs_idl.transaction [-] OVSDB Error: The transaction failed because the IDL has been configured to require a database lock but didn't get it yet or has already lost it
2023-09-28 09:26:39.725 747 ERROR ovsdbapp.backend.ovs_idl.transaction [req-ea36722f-0be8-4b0a-af2e-7fb336538969 - - - - -] Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/ovsdbapp/backend/ovs_idl/connection.py", line 122, in run
    txn.results.put(txn.do_commit())
  File "/usr/lib/python3/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 119, in do_commit
    raise RuntimeError(msg)
RuntimeError: OVSDB Error:...
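
The "clustered database server is not cluster leader; trying another server" lines are the expected behaviour of a leader-only IDL client against a clustered (Raft) database: it keeps hopping servers until it reaches the current leader, which is what the 09:26:39 reconnect burst shows. A minimal sketch of that setup using the python-ovs IDL that ovsdbapp builds on (addresses are placeholders):

    from ovs.db import idl as ovs_idl
    from ovsdbapp.backend.ovs_idl import idlutils

    remotes = 'tcp:192.0.2.10:6642,tcp:192.0.2.11:6642,tcp:192.0.2.12:6642'
    helper = idlutils.get_schema_helper(remotes, 'OVN_Southbound')
    helper.register_all()
    # leader_only=True makes the client reconnect until it finds the Raft
    # leader, emitting the "not cluster leader" INFO messages seen above.
    i = ovs_idl.Idl(remotes, helper, leader_only=True)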


Lajos Katona (lajos-katona) wrote:

Hi, I can't build an OVN-based devstack with Victoria (or with Wallaby), and I can't reproduce this on master.
Is it possible for you to try it on master (or on 2023.2)?

Attasit (attasies) wrote:

Hi, sorry for the late reply. I upgraded OVN and neutron to master, and the issue is fixed.
Thank you for your support.

Changed in neutron:
status: Incomplete → Invalid