StaleDataError: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched.

Bug #1716321 reported by Christoph Fiehe
This bug affects 15 people
Affects: neutron
Status: Fix Released
Importance: Undecided
Assigned to: Unassigned
Milestone: (none)

Bug Description

I have manually set up an HA/DVR deployment of OpenStack Pike on Ubuntu 16.04. The neutron.log periodically shows the following exception:

2017-09-11 07:50:34.655 5830 ERROR oslo_db.api [req-ed871530-109c-475a-8b52-9422b156d24b - - - - -] DB exceeded retry limit.: StaleDataError: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched.
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api Traceback (most recent call last):
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/oslo_db/api.py", line 138, in wrapper
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api return f(*args, **kwargs)
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/neutron/db/api.py", line 128, in wrapped
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api LOG.debug("Retry wrapper got retriable exception: %s", e)
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api self.force_reraise()
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api six.reraise(self.type_, self.value, self.tb)
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/neutron/db/api.py", line 124, in wrapped
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api return f(*dup_args, **dup_kwargs)
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/plugin.py", line 1699, in update_port_statuses
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api context, port_dbs_by_id[port_id], status, host)
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/plugin.py", line 1711, in _safe_update_individual_port_db_status
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api ectx.reraise = bool(db.get_port(context, port_id))
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api self.force_reraise()
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api six.reraise(self.type_, self.value, self.tb)
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/plugin.py", line 1707, in _safe_update_individual_port_db_status
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api context, port, status, host)
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/plugin.py", line 1750, in _update_individual_port_db_status
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api updated = True
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/contextlib.py", line 24, in __exit__
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api self.gen.next()
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/oslo_db/sqlalchemy/enginefacade.py", line 1029, in _transaction_scope
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api yield resource
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/contextlib.py", line 24, in __exit__
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api self.gen.next()
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/oslo_db/sqlalchemy/enginefacade.py", line 641, in _session
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api self.session.rollback()
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api self.force_reraise()
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api six.reraise(self.type_, self.value, self.tb)
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/oslo_db/sqlalchemy/enginefacade.py", line 638, in _session
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api self._end_session_transaction(self.session)
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/oslo_db/sqlalchemy/enginefacade.py", line 666, in _end_session_transaction
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api session.commit()
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 874, in commit
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api self.transaction.commit()
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 461, in commit
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api self._prepare_impl()
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 441, in _prepare_impl
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api self.session.flush()
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 2139, in flush
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api self._flush(objects)
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 2259, in _flush
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api transaction.rollback(_capture_exception=True)
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/sqlalchemy/util/langhelpers.py", line 66, in __exit__
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api compat.reraise(exc_type, exc_value, exc_tb)
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 2223, in _flush
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api flush_context.execute()
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/sqlalchemy/orm/unitofwork.py", line 389, in execute
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api rec.execute(self)
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/sqlalchemy/orm/unitofwork.py", line 548, in execute
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api uow
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/sqlalchemy/orm/persistence.py", line 177, in save_obj
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api mapper, table, update)
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api File "/usr/lib/python2.7/dist-packages/sqlalchemy/orm/persistence.py", line 760, in _emit_update_statements
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api (table.description, len(records), rows))
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api StaleDataError: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched.
2017-09-11 07:50:34.655 5830 ERROR oslo_db.api
2017-09-11 07:50:34.657 5830 ERROR neutron.plugins.ml2.rpc [req-ed871530-109c-475a-8b52-9422b156d24b - - - - -] Failed to update device c34b99f0-fdf4-4cb4-b218-33d01de26af4 up: StaleDataError: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched.

I am using MariaDB:
$ mysql --version
mysql Ver 15.1 Distrib 10.1.26-MariaDB, for debian-linux-gnu (x86_64) using readline 5.2

I can also offer an extract from the database showing that all the relevant information is available:

INSERT INTO `ports` (`project_id`, `id`, `name`, `network_id`, `mac_address`, `admin_state_up`, `status`, `device_id`, `device_owner`, `standard_attr_id`, `ip_allocation`) VALUES ('ed2d2efd86dd40e7a45491d8502318d3', 'c34b99f0-fdf4-4cb4-b218-33d01de26af4', '', '18e47141-f3f3-4071-a51d-f93d80a2bacd', '02:05:69:e7:e7:ba', 1, 'ACTIVE', 'f7f8682e-ae8a-49f4-a00f-5530cb8865bc', 'network:router_interface_distributed', 258, 'immediate');

INSERT INTO `standardattributes` (`id`, `resource_type`, `created_at`, `updated_at`, `description`, `revision_number`) VALUES (258, 'ports', '2017-09-07 08:29:28', '2017-09-11 05:39:22', '', 1150832);
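
For what it's worth, the message itself comes from SQLAlchemy: during a flush it checks the rowcount of each UPDATE, and when a mapper carries a version_id_col it also adds the expected version to the WHERE clause, so a row that was concurrently bumped (or deleted) matches 0 rows and raises StaleDataError. I am not sure whether Neutron maps revision_number exactly this way, but here is a minimal standalone sketch (a toy model of my own, assuming SQLAlchemy 1.4+ and a throwaway SQLite file, not Neutron code) that produces the identical error text:

# Toy reproduction of the error message (NOT Neutron's actual model -- I don't
# know whether Neutron maps revision_number as a version_id_col). Assumes
# SQLAlchemy 1.4+ and uses a throwaway file-based SQLite DB so that the two
# sessions get separate connections, like two neutron-server workers would.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker
from sqlalchemy.orm.exc import StaleDataError

Base = declarative_base()


class StandardAttribute(Base):
    __tablename__ = 'standardattributes'
    id = Column(Integer, primary_key=True)
    description = Column(String(255), default='')
    revision_number = Column(Integer, nullable=False)
    # Optimistic locking: every UPDATE gets "AND revision_number = <loaded value>".
    __mapper_args__ = {'version_id_col': revision_number}


engine = create_engine('sqlite:///stale_demo.db')
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

setup = Session()
setup.add(StandardAttribute(id=258, description=''))
setup.commit()
setup.close()

s1, s2 = Session(), Session()                               # two "workers"
row1 = s1.query(StandardAttribute).filter_by(id=258).one()  # both load the same
row2 = s2.query(StandardAttribute).filter_by(id=258).one()  # revision_number

row2.description = 'touched by the other worker'
s2.commit()                                                 # bumps revision_number

row1.description = 'stale update'
try:
    s1.commit()  # UPDATE ... WHERE id = 258 AND revision_number = <old> -> 0 rows
except StaleDataError as exc:
    print(exc)   # UPDATE statement on table 'standardattributes' expected to
                 # update 1 row(s); 0 were matched.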

Can someone please have a look at this issue?

Revision history for this message
Swaminathan Vasudevan (swaminathan-vasudevan) wrote :

I think there is a fix that was proposed by @kevinbenton for this.

Revision history for this message
Swaminathan Vasudevan (swaminathan-vasudevan) wrote :
Revision history for this message
Boden R (boden) wrote :

Moving to incomplete.
We need the originator to try the patch mentioned in comment #2, which appears to have already landed.

If this issue persists, please reopen in the "New" state.

Changed in neutron:
status: New → Incomplete
Revision history for this message
Christoph Fiehe (fiehe) wrote :
Changed in neutron:
status: Incomplete → Fix Released
Revision history for this message
charlie (chz8494) wrote :

It doesn't look like it works for me. I even tried the latest version available on GitHub; if I create one VM and attach it to a VXLAN network, I still get the same error.

Revision history for this message
shihanzhang (shihanzhang) wrote :

I have the same problem even with this patch applied: https://review.openstack.org/#/c/503517/, and it leads to an incorrect OVS flow pipeline for DVR on the compute node.

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to neutron (master)

Fix proposed to branch: master
Review: https://review.openstack.org/534449

Revision history for this message
Sridhar Venkat (svenkat) wrote :

After patching my environment with code changes from https://review.openstack.org/534449, I am still seeing this error:

2018-01-28 03:49:08.621 5932 ERROR oslo_db.api Traceback (most recent call last):
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/oslo_db/api.py", line 138, in wrapper
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api return f(*args, **kwargs)
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/db/api.py", line 128, in wrapped
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api LOG.debug("Retry wrapper got retriable exception: %s", e)
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api self.force_reraise()
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api six.reraise(self.type_, self.value, self.tb)
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/db/api.py", line 124, in wrapped
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api return f(*dup_args, **dup_kwargs)
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 1278, in create_port
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api result, mech_context = self._create_port_db(context, port)
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 1253, in _create_port_db
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api network = self.get_network(context, result['network_id'])
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/db/api.py", line 163, in wrapped
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api return method(*args, **kwargs)
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 996, in get_network
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api self.type_manager.extend_network_dict_provider(context, net_data)
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/managers.py", line 154, in extend_network_dict_provider
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api return self.extend_networks_dict_provider(context, [network])
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/managers.py", line 158, in extend_networks_dict_provider
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api net_segments = segments_db.get_networks_segments(context, ids)
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/db/segments_db.py", line 79, in get_networks_segments
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api objs = network_obj.NetworkSegment.get_objects(context, **filters)
2018-01-28 03:49:08.621 5932 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/...


Revision history for this message
Piotr Misiak (piotr-misiak) wrote :

I have the same ERROR messages in Neutron 11.0.3.

When I try to spawn 100 VMs at once, several of them end up in the ERROR state.

For every VM in the ERROR state I see the following in the nova-compute logs:

2018-03-22 17:06:26.148 7 WARNING nova.virt.libvirt.driver [req-ba580644-629a-4bec-afce-9d92b50c10de 726c3421bafc468ab0e2db45fc18f1e7 7ff6681fd929452d885184e39b1ba75d - default default] [instance: 910d4ef1-fcc5-46a6-8f87-c029e45059fd] Timeout waiting for vif plugging callback for instance with vm_state building and task_state spawning.: Timeout: 300 seconds

The interesting thing is that the ERROR state always hits VMs spawned one right after another, for example VMs test-11 through test-16, so it looks like things failed only at one point in time.

In neutron-server logs I have ERRORs:

2018-03-22 17:01:56.365 46 ERROR oslo_db.api [req-87546baa-6412-4c3e-8964-0bbc055beec1 - - - - -] DB exceeded retry limit.: StaleDataError: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched.
2018-03-22 17:01:56.365 46 ERROR oslo_db.api Traceback (most recent call last):
2018-03-22 17:01:56.365 46 ERROR oslo_db.api File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/oslo_db/api.py", line 138, in wrapper
2018-03-22 17:01:56.365 46 ERROR oslo_db.api return f(*args, **kwargs)
2018-03-22 17:01:56.365 46 ERROR oslo_db.api File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/neutron/db/api.py", line 128, in wrapped
2018-03-22 17:01:56.365 46 ERROR oslo_db.api LOG.debug("Retry wrapper got retriable exception: %s", e)
2018-03-22 17:01:56.365 46 ERROR oslo_db.api File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2018-03-22 17:01:56.365 46 ERROR oslo_db.api self.force_reraise()
2018-03-22 17:01:56.365 46 ERROR oslo_db.api File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2018-03-22 17:01:56.365 46 ERROR oslo_db.api six.reraise(self.type_, self.value, self.tb)
2018-03-22 17:01:56.365 46 ERROR oslo_db.api File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/neutron/db/api.py", line 124, in wrapped
2018-03-22 17:01:56.365 46 ERROR oslo_db.api return f(*dup_args, **dup_kwargs)
2018-03-22 17:01:56.365 46 ERROR oslo_db.api File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 397, in _bind_port_if_needed
2018-03-22 17:01:56.365 46 ERROR oslo_db.api need_notify, try_again))
2018-03-22 17:01:56.365 46 ERROR oslo_db.api File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 543, in _commit_port_binding
2018-03-22 17:01:56.365 46 ERROR oslo_db.api cur_binding.host)
2018-03-22 17:01:56.365 46 ERROR oslo_db.api File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 979, in wrapper
2018-03-22 17:01:56.365 46 ERROR oslo_db.api return fn(*args, **kwargs)
2018-03-22 17:01:56.365 46 ERROR oslo_db.api File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/neutron/plugins/ml2/db.py", line 114, in clear_binding_levels
2018-03-22 17:01:56.365 ...

Revision history for this message
Sridhar Venkat (svenkat) wrote :

Any updates on this defect?

Revision history for this message
Ryan Tidwell (ryan-tidwell) wrote :
Revision history for this message
Mithil Arun (arun-mithil) wrote :

Can we revisit this? I continue to see this in Rocky, where the l3 agent logs are flooded with messages like:

--snip--
2020-08-25 19:22:39.624 18102 ERROR neutron.agent.l3.agent [-] Failed to process compatible router: ea187315-b0c7-4f2e-98e9-128a923fca4e: oslo_messaging.rpc.client.RemoteError: Remote error: StaleDataError UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched.
--snip--

Revision history for this message
Imran Hussain (imranh2) wrote :

I'm seeing this in stable/xena with DVR+HA L3 routers. I use Kolla to deploy my stack.

From digging around in the code I think it's just a race condition; I got a simple test case going.

The patch below worked for me; specifically, it needs to be applied to the neutron-lib that neutron-server uses. It doesn't really fix the issue, but it made my test case pass every time.

Patch:

--- /var/lib/kolla/venv/lib/python3.8/site-packages/neutron_lib/db/api.py 2022-11-03 12:30:52.630197309 +0000
+++ /var/lib/kolla/venv/lib/python3.8/site-packages/neutron_lib/db/api.py 2022-11-03 12:31:14.278481076 +0000
@@ -37,7 +37,7 @@
 from neutron_lib.objects import exceptions as obj_exc

-MAX_RETRIES = 20
+MAX_RETRIES = 50
 LOG = logging.getLogger(__name__)
 _synchronized = lockutils.synchronized_with_prefix("neutron-")
 _CTX_MANAGER = None
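
To be clear about what the patch changes: the retry decorator around the DB operations catches retriable exceptions such as StaleDataError and simply re-runs the whole operation, so raising MAX_RETRIES only gives the race more attempts to resolve itself. Roughly (a conceptual sketch of my own, not the actual neutron-lib/oslo_db code):

# Conceptual sketch only -- NOT the real oslo_db/neutron-lib implementation.
# It illustrates why bumping MAX_RETRIES merely widens the window in which
# the race can resolve rather than removing the race itself.
import functools
import random
import time

from sqlalchemy.orm.exc import StaleDataError

MAX_RETRIES = 50  # the value the patch above raises from 20


def retry_db_errors(func):
    """Re-run the wrapped DB operation when it hits a retriable error."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        for attempt in range(1, MAX_RETRIES + 1):
            try:
                return func(*args, **kwargs)
            except StaleDataError:
                if attempt == MAX_RETRIES:
                    raise  # surfaces as "DB exceeded retry limit." in the log
                # Jittered backoff so competing workers do not retry in lockstep.
                time.sleep(random.uniform(0, 0.1) * attempt)
    return wrapper


@retry_db_errors
def update_port_status(port_id, status):
    # Hypothetical operation: load the port, mutate it, commit. Any of these
    # steps can lose the race and raise StaleDataError, triggering a retry.
    ...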
