[neutron-db] neutron-server reports the error "UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched", which can cause OVS flows to be missed

Bug #1738337 reported by Zachary Ma
This bug affects 8 people
Affects: neutron | Status: New | Importance: Low | Assigned to: Zachary Ma

Bug Description

4124:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api [req-3a38738f-efbf-45b0-ae65-3af84f6aae2c - - - - -] DB exceeded retry limit.: StaleDataError: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched.
4125:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api Traceback (most recent call last):
4126:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/oslo_db/api.py", line 138, in wrapper
4127:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api return f(*args, **kwargs)
4128:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/db/api.py", line 128, in wrapped
4129:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api LOG.debug("Retry wrapper got retriable exception: %s", e)
4130:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
4131:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api self.force_reraise()
4132:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
4133:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api six.reraise(self.type_, self.value, self.tb)
4134:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/db/api.py", line 124, in wrapped
4135:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api return f(*dup_args, **dup_kwargs)
4136:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 1702, in update_port_statuses
4137:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api context, port_dbs_by_id[port_id], status, host)
4138:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 1714, in _safe_update_individual_port_db_status
4139:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api ectx.reraise = bool(db.get_port(context, port_id))
4140:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
4141:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api self.force_reraise()
4142:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
4143:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api six.reraise(self.type_, self.value, self.tb)
4144:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 1710, in _safe_update_individual_port_db_status
4145:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api context, port, status, host)
4146:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 1769, in _update_individual_port_db_status
4147:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api levels = db.get_binding_levels(context, port_id, host)
4148:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 979, in wrapper
4149:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api return fn(*args, **kwargs)
4150:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/db.py", line 100, in get_binding_levels
4151:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api order_by(models.PortBindingLevel.level).
4152:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/query.py", line 2703, in all
4153:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api return list(self)
4154:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/query.py", line 2854, in __iter__
4155:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api self.session._autoflush()
4156:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 1397, in _autoflush
4157:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api self.flush()
4158:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 2171, in flush
4159:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api self._flush(objects)
4160:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 2291, in _flush
4161:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api transaction.rollback(_capture_exception=True)
4162:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib64/python2.7/site-packages/sqlalchemy/util/langhelpers.py", line 66, in __exit__
4163:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api compat.reraise(exc_type, exc_value, exc_tb)
4164:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/session.py", line 2255, in _flush
4165:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api flush_context.execute()
4166:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 389, in execute
4167:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api rec.execute(self)
4168:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 548, in execute
4169:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api uow
4170:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 177, in save_obj
4171:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api mapper, table, update)
4172:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 760, in _emit_update_statements
4173:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api (table.description, len(records), rows))
4174:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api StaleDataError: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched.
4175:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api
4176:2017-09-19 17:10:15.216 10600 ERROR neutron.plugins.ml2.rpc [req-3a38738f-efbf-45b0-ae65-3af84f6aae2c - - - - -] Failed to update device 4bd345c4-c8bd-43ff-9525-dbf3e4656c40 up: StaleDataError: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched.

Tags: db
Zachary Ma (mazengxie)
tags: added: db
Revision history for this message
Zachary Ma (mazengxie) wrote :

[root@10e131e73e121 ~]# grep -rin "standardattribute" /var/log/neutron/server.log
83922:2017-12-15 10:07:05.128 65701 DEBUG neutron.db.api [req-fe4da6c8-763f-41fa-9dac-537863c11740 - - - - -] Retry wrapper got retriable exception: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched. wrapped /usr/lib/python2.7/site-packages/neutron/db/api.py:128
84013:2017-12-15 10:07:05.325 65703 DEBUG neutron.db.api [req-2632a1a7-f65e-45d9-be7b-ab4ce3c7372b - - - - -] Retry wrapper got retriable exception: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched. wrapped /usr/lib/python2.7/site-packages/neutron/db/api.py:128

description: updated
Revision history for this message
Slawek Kaplonski (slaweq) wrote :

Can you provide some more details:
* what were you doing when it happened?
* which version of OpenStack are you using?
* what Neutron configuration do you have?

Revision history for this message
Zachary Ma (mazengxie) wrote :

@Slawek Kaplonski (slaweq)

1. I don't know how to reproduce the problem. I just created a VM and attached a network.

2. Pike.

3. ML2/OVS, no special configuration.

Revision history for this message
Lujin Luo (luo-lujin) wrote :

Did you see anything actually fail? I.e. can you still attach a network to the created VM and ping it?

I have some experience working with standardattributes. It looks like you probably hit a race condition, e.g. one thread updated the resource while another thread was in the middle of its own update to the same resource. The second one then fails because the revision_number no longer matches, since we use a compare-and-swap mechanism.
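
To illustrate that compare-and-swap: a minimal sketch (SQLAlchemy 1.x era; not neutron's actual model, the table and column names are only borrowed for illustration) of how a version_id_col column turns every UPDATE into "... WHERE revision_number = <value that was read>" and raises exactly the StaleDataError seen in the traceback when a concurrent writer has already bumped the revision:

from sqlalchemy import Column, Integer, String, create_engine, text
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
from sqlalchemy.orm.exc import StaleDataError

Base = declarative_base()

class StandardAttr(Base):
    __tablename__ = 'standardattributes'
    id = Column(Integer, primary_key=True)
    description = Column(String(255))
    revision_number = Column(Integer, nullable=False)
    # Every ORM UPDATE gets "AND revision_number = <loaded value>" appended;
    # if 0 rows match, SQLAlchemy raises StaleDataError.
    __mapper_args__ = {'version_id_col': revision_number}

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

session.add(StandardAttr(id=1, description='port'))
session.commit()                                 # revision_number starts at 1

obj = session.query(StandardAttr).get(1)         # loaded with revision_number == 1
# Simulate a concurrent writer bumping the revision behind the ORM's back.
session.execute(text(
    "UPDATE standardattributes SET revision_number = 2 WHERE id = 1"))
obj.description = 'port updated'
try:
    session.flush()                              # UPDATE ... WHERE revision_number = 1
except StaleDataError as exc:
    print(exc)   # expected to update 1 row(s); 0 were matched.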

Revision history for this message
Lujin Luo (luo-lujin) wrote :

Also, when I asked whether you saw anything actually fail, I meant failures such as being unable to create a network.

Revision history for this message
Zachary Ma (mazengxie) wrote :

@Lujin Luo yes, I can still create networks, because @_retry_db_errors retries up to 10 times.
But the "does not match" log has kept appearing for several days, and I want to know how to solve it.
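
For reference, a stripped-down stand-in (illustrative only, not the real neutron.db.api / oslo_db.api code) for the retry wrapper that produces the "Retry wrapper got retriable exception" DEBUG lines and the "DB exceeded retry limit" ERROR above:

import logging
import random
import time

from sqlalchemy.orm.exc import StaleDataError

LOG = logging.getLogger(__name__)
MAX_RETRIES = 10   # "will try 10 times at most", as noted above


def retry_db_errors(func):
    def wrapper(*args, **kwargs):
        for attempt in range(1, MAX_RETRIES + 1):
            try:
                return func(*args, **kwargs)
            except StaleDataError as exc:
                LOG.debug("Retry wrapper got retriable exception: %s", exc)
                if attempt == MAX_RETRIES:
                    LOG.error("DB exceeded retry limit.")
                    raise
                # back off a little so the competing writer can finish
                time.sleep(random.uniform(0, 0.1 * attempt))
    return wrapper

The retry usually succeeds, which is why the API operation still completes; the open question in this bug is what the retried function sees on the second attempt (see the analysis further down).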

Revision history for this message
Lujin Luo (luo-lujin) wrote :

We still need to know which commands you executed before you started to see such error logs. Without knowing that, there is no way we can tell for sure. As I said, it may be a potential bug, or it may be expected behaviour, since this is how we avoid race conditions.

Revision history for this message
Zachary Ma (mazengxie) wrote :

> The second one then fails because the revision_number no longer matches, since we use a compare-and-swap mechanism.

Sorry, I do not understand why the revision_number does not match. I looked at the "standardattributes" table in the neutron DB.

Revision history for this message
Zachary Ma (mazengxie) wrote :

[root@10e131e73e121 ~]# grep -rin "standardattri" /var/log/neutron/server.log
87580:2017-12-19 10:08:57.725 33834 DEBUG neutron.db.api [req-b6dec2c3-cc8a-4f0a-b8d8-aecac3d205c3 - - - - -] Retry wrapper got retriable exception: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched. wrapped /usr/lib/python2.7/site-packages/neutron/db/api.py:128
165658:2017-12-19 15:52:53.345 33832 DEBUG neutron.db.api [req-4846cd7b-94f2-4334-9324-df87a390d862 - - - - -] Retry wrapper got retriable exception: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched. wrapped /usr/lib/python2.7/site-packages/neutron/db/api.py:128

The request in question is req-b6dec2c3-cc8a-4f0a-b8d8-aecac3d205c3:

[root@10e131e73e121 ~]# grep -rin "req-b6dec2c3-cc8a-4f0a-b8d8-aecac3d205c3" /var/log/neutron/server.log
87202:2017-12-19 10:08:56.189 33834 DEBUG neutron.db.l3_hamode_db [req-b6dec2c3-cc8a-4f0a-b8d8-aecac3d205c3 - - - - -] neutron.services.l3_router.l3_router_plugin.L3RouterPlugin method get_ha_sync_data_for_host called with arguments (<neutron_lib.context.Context object at 0x8cb7590>, u'10e131e73e28', <neutron.db.models.agent.Agent[object at 77417d0] {id=u'66cb0a91-18c4-49bf-99ee-50f0787fb023', agent_type=u'L3 agent', binary=u'neutron-l3-agent', topic=u'l3_agent', host=u'10e131e73e28', availability_zone=u'nova', admin_state_up=True, created_at=datetime.datetime(2017, 11, 3, 7, 45, 27), started_at=datetime.datetime(2017, 12, 17, 12, 30, 9), heartbeat_timestamp=datetime.datetime(2017, 12, 19, 2, 8, 43), description=None, configurations=u'{"agent_mode": "dvr", "gateway_external_network_id": "", "handle_internal_only_routers": true, "routers": 11, "interfaces": 17, "floating_ips": 3, "interface_driver": "openvswitch", "log_agent_heartbeats": false, "external_network_bridge": "", "ex_gw_ports": 11}', resource_versions=None, load=0}>) {'active': True, 'router_ids': [u'7513735a-67d0-42e5-9cfd-edbbcccec0e9']} wrapper /usr/lib/python2.7/site-packages/oslo_log/helpers.py:66
87203:2017-12-19 10:08:56.189 33834 DEBUG neutron.db.l3_dvr_db [req-b6dec2c3-cc8a-4f0a-b8d8-aecac3d205c3 - - - - -] neutron.services.l3_router.l3_router_plugin.L3RouterPlugin method _get_dvr_sync_data called with arguments (<neutron_lib.context.Context object at 0x8cb7590>, u'10e131e73e28', <neutron.db.models.agent.Agent[object at 77417d0] {id=u'66cb0a91-18c4-49bf-99ee-50f0787fb023', agent_type=u'L3 agent', binary=u'neutron-l3-agent', topic=u'l3_agent', host=u'10e131e73e28', availability_zone=u'nova', admin_state_up=True, created_at=datetime.datetime(2017, 11, 3, 7, 45, 27), started_at=datetime.datetime(2017, 12, 17, 12, 30, 9), heartbeat_timestamp=datetime.datetime(2017, 12, 19, 2, 8, 43), description=None, configurations=u'{"agent_mode": "dvr", "gateway_external_network_id": "", "handle_internal_only_routers": true, "routers": 11, "interfaces": 17, "floating_ips": 3, "interface_driver": "openvswitch", "log_agent_heartbeats": false, "external_network_bridge": "", "ex_gw_ports": 11}', resource_versions=None, load=0}>, [u'7513735a-67d0-42e5-9cfd-edbbcccec0e9'], True) {} wrapper /usr/lib/python2.7/site-packages/oslo_log/helpers.py:66
87205:2017-12-19 10:08:...

Revision history for this message
Zachary Ma (mazengxie) wrote :

it may be a potential bug

Revision history for this message
shihanzhang (shihanzhang) wrote :

I also have the same problem. In a DVR environment it sometimes loses the OVS flows for l2pop on the compute node.

Revision history for this message
Zachary Ma (mazengxie) wrote :

@shihanzhang, we hit the same problem: OVS flows are sometimes lost in a DVR environment.
It is very bad for our China Telecom public cloud.
Did you find the reason?

Zachary Ma (mazengxie)
Changed in neutron:
assignee: nobody → Zachary Ma (mazengxie)
Revision history for this message
shihanzhang (shihanzhang) wrote :

@Zachary Ma, it is caused by this patch: https://review.openstack.org/#/c/503517/

Revision history for this message
Zachary Ma (mazengxie) wrote :

Yes, I know that patch and it fixes part of the problem. The following error logs have not occurred in our public cloud since the patch was merged in September.

4124:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api [req-3a38738f-efbf-45b0-ae65-3af84f6aae2c - - - - -] DB exceeded retry limit.: StaleDataError: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched.
4125:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api Traceback (most recent call last):
4126:2017-09-19 17:10:15.211 10600 ERROR oslo_db.api File "/usr/lib/python2.7/site-packages/oslo_db/api.py", line 138, in wrapper

But now it sometimes reports the following debug logs when a DVR router is created and a network is bound.
I am not sure whether it is a potential bug or not,
because race conditions occur with high probability there.

[root@10e131e73e121 ~]# grep -rin "standardattribute" /var/log/neutron/server.log
83922:2017-12-15 10:07:05.128 65701 DEBUG neutron.db.api [req-fe4da6c8-763f-41fa-9dac-537863c11740 - - - - -] Retry wrapper got retriable exception: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched. wrapped /usr/lib/python2.7/site-packages/neutron/db/api.py:128
84013:2017-12-15 10:07:05.325 65703 DEBUG neutron.db.api [req-2632a1a7-f65e-45d9-be7b-ab4ce3c7372b - - - - -] Retry wrapper got retriable exception: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched. wrapped /usr/lib/python2.7/site-packages/neutron/db/api.py:128

Revision history for this message
Ann Taraday (akamyshnikova) wrote :

>> 3. ML2/OVS, no special configuration.
You use DVR, as I see; do you use L3 HA as well? Those are actually special configurations.
I am still missing some information from the report:
you create a network and a VM, then something happens (what do you notice?), and these errors appear in the logs.

Changed in neutron:
importance: Undecided → Low
Revision history for this message
Zachary Ma (mazengxie) wrote :

@Ann Taraday we never use L3 HA.

Now that we have merged the patch https://review.openstack.org/#/c/503517/ into our environment, there are no more ERROR logs, only DEBUG logs about "standardattributes":

[root@10e131e73e121 ~]# grep -rin "standardattribute" /var/log/neutron/server.log
83922:2017-12-15 10:07:05.128 65701 DEBUG neutron.db.api [req-fe4da6c8-763f-41fa-9dac-537863c11740 - - - - -] Retry wrapper got retriable exception: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched. wrapped /usr/lib/python2.7/site-packages/neutron/db/api.py:128
84013:2017-12-15 10:07:05.325 65703 DEBUG neutron.db.api [req-2632a1a7-f65e-45d9-be7b-ab4ce3c7372b - - - - -] Retry wrapper got retriable exception: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched. wrapped /usr/lib/python2.7/site-packages/neutron/db/api.py:128

Revision history for this message
Zachary Ma (mazengxie) wrote :

176302 2017-12-26 18:53:25.187 11925 DEBUG neutron.plugins.ml2.rpc [req-37d49036-2d6d-4548-9bed-2fa880fcd77a - - - - -] Update DVR port a8f7751a-0564-4e11-9dc5-553b732c51a7 at host 10e131e78e34 status to active. update_port_status_to_active /usr/lib/python2.7/site-packages/neutron/plugins/ml2/rpc.py:278
176303 2017-12-26 18:53:25.194 11921 DEBUG neutron.wsgi [-] (11921) accepted ('10.131.78.148', 60032) server /usr/lib/python2.7/site-packages/eventlet/wsgi.py:883
176304 2017-12-26 18:53:25.196 11921 INFO neutron.wsgi [-] 10.131.78.148 "OPTIONS / HTTP/1.0" status: 200 len: 251 time: 0.0011199
176305 2017-12-26 18:53:25.228 11925 DEBUG neutron_lib.callbacks.manager [req-37d49036-2d6d-4548-9bed-2fa880fcd77a - - - - -] Notify callbacks [] for port, before_update _notify_loop /usr/lib/python2.7/site-packages/neutron_lib/callbacks/manager.py:167
176306 2017-12-26 18:53:25.238 11925 DEBUG neutron.plugins.ml2.plugin [req-37d49036-2d6d-4548-9bed-2fa880fcd77a - - - - -] The updated value of port a8f7751a-0564-4e11-9dc5-553b732c51a7 is True. _update_individual_port_db_status /usr/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py:1759
176307 2017-12-26 18:53:25.425 11925 DEBUG neutron.db.api [req-37d49036-2d6d-4548-9bed-2fa880fcd77a - - - - -] Retry wrapper got retriable exception: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched. wrapped /usr/lib/python2.7/site-packages/neutron/db/api.py:128
176308 2017-12-26 18:53:25.425 11925 DEBUG oslo_db.api [req-37d49036-2d6d-4548-9bed-2fa880fcd77a - - - - -] Performing DB retry for function neutron.plugins.ml2.plugin.update_port_statuses wrapper /usr/lib/python2.7/site-packages/oslo_db/api.py:152
176309 2017-12-26 18:53:25.567 11925 DEBUG neutron_lib.callbacks.manager [req-37d49036-2d6d-4548-9bed-2fa880fcd77a - - - - -] Notify callbacks [] for port, before_update _notify_loop /usr/lib/python2.7/site-packages/neutron_lib/callbacks/manager.py:167
176310 2017-12-26 18:53:25.571 11925 DEBUG neutron.plugins.ml2.plugin [req-37d49036-2d6d-4548-9bed-2fa880fcd77a - - - - -] The updated value of port a8f7751a-0564-4e11-9dc5-553b732c51a7 is False. _update_individual_port_db_status /usr/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py:1759
176311 2017-12-26 18:53:25.817 11925 DEBUG neutron.plugins.ml2.db [req-37d49036-2d6d-4548-9bed-2fa880fcd77a - - - - -] For port a8f7751a-0564-4e11-9dc5-553b732c51a7, host 10e131e78e34, got binding levels [<neutron.plugins.ml2.models.PortBindingLevel[object at 5d4f510] {port_id=u'a8f7751a-0564-4e11-9dc5-553b732c51a7', host=u'10e131e78e34', level=0, driver=u'openvswitch', segment_id=u'4f9fde68-f2ce-43cc-bab3-097721affb3c'}>] get_binding_levels /usr/lib/python2.7/site-packages/neutron/plugins/ml2/db.py:106

As this log shows,
_update_individual_port_db_status sometimes causes neutron.plugins.ml2.plugin.update_port_statuses to perform a DB retry, but by the time of the retry port.status has already been changed.
So the OVS flows for the port end up missing.
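
The sequence in the log can be reconstructed with a small runnable toy (hypothetical names, not neutron code): a concurrent request changes the port between the first attempt and the retry, so on the retry the status is already ACTIVE, "updated" flips from True to False, and the notification that would eventually program the OVS flows is skipped.

class StaleDataError(Exception):
    pass


class FakePort(object):
    def __init__(self):
        self.status = 'DOWN'
        self.revision = 1


def update_port_status(port, new_status, concurrent_writer=None):
    read_status, read_revision = port.status, port.revision   # what the DB read returned
    updated = (read_status != new_status)    # the "The updated value ... is True/False" log line
    if concurrent_writer is not None:        # another request slips in before our flush
        concurrent_writer(port)
    if port.revision != read_revision:       # simplified stand-in for the revision_number guard
        raise StaleDataError('0 rows were matched')
    port.status, port.revision = new_status, read_revision + 1
    return updated                           # True is what lets callers go on to notify / set up flows


def other_request(port):
    # e.g. a status update for the same DVR port arriving from another worker
    port.status, port.revision = 'ACTIVE', port.revision + 1


port = FakePort()
try:
    update_port_status(port, 'ACTIVE', concurrent_writer=other_request)   # attempt 1: updated would be True
except StaleDataError:
    print(update_port_status(port, 'ACTIVE'))   # the DB retry prints False: nothing downstream happens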

summary: [neutron-db] neutron-server report error "UPDATE statement on table
- 'standardattributes' expected to update 1 row(s); 0 were matched'
+ 'standardattributes' expected to update 1 row(s); 0 were matched'. It
+ will be caused the ovs flow miss
Revision history for this message
Ryan Tidwell (ryan-tidwell) wrote :

I am also observing this behavior. It occurs when attempting to live-migrate a handful of VMs in rapid succession to the same host. This quick and dirty script is able to make the issue appear:

for i in `openstack server list --all-projects --host <origin> -c ID -f value`; do openstack server migrate $i --live <target>; done

From the neutron server logs:

DB exceeded retry limit.: StaleDataError: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched.
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api Traceback (most recent call last):
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/oslo_db/api.py", line 138, in wrapper
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api return f(*args, **kwargs)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/neutron/db/api.py", line 128, in wrapped
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api LOG.debug("Retry wrapper got retriable exception: %s", e)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api self.force_reraise()
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api six.reraise(self.type_, self.value, self.tb)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/neutron/db/api.py", line 124, in wrapped
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api return f(*dup_args, **dup_kwargs)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 1346, in update_port
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api mech_context, attrs)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 354, in _process_port_binding
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api db.clear_binding_levels(plugin_context, port_id, original_host)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 979, in wrapper
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api return fn(*args, **kwargs)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api File "/usr/lib64/python2.7/contextlib.py", line 24, in __exit__
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api self.gen.next()
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 1029, in _transaction_scope
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api yield resource
2019-01-24 09:57:34.959 255478 ERROR oslo_d...


Revision history for this message
Ryan Tidwell (ryan-tidwell) wrote :