curl -s -X POST -H "Content-Type: application/json" -H "X-Auth-Token: $MYTOKEN" \
  -d '{"name": "test_lb", "vip": {"subnet_id": $SUBNET_ID}, "listeners": [{"name": "test_listener", "protocol": "HTTP", "protocol_port": 80, "default_pool": {"name": "test_pool1", "protocol": "HTTP", "lb_algorithm": "ROUND_ROBIN", "members": [{"ip_address": "10.32.42.181", "protocol_port": 80}, {"ip_address": "10.32.47.154", "protocol_port": 80}]}}]}' \
  http://${OCTAVIA_IP}:${OCTAVIA_PORT}/v1/loadbalancers | jq
{
"debuginfo": "Traceback (most recent call last):\n\n File \"/opt/octavia/env/lib/python2.7/site-packages/wsmeext/pecan.py\", line 85, in callfunction\n result = f(self, *args, **kwargs)\n\n File \"/opt/octavia/env/lib/python2.7/site-packages/octavia/api/v1/controllers/load_balancer.py\", line 155, in post\n lock_session.rollback()\n\n File \"/opt/octavia/env/lib/python2.7/site-packages/oslo_utils/excutils.py\", line 220, in __exit__\n self.force_reraise()\n\n File \"/opt/octavia/env/lib/python2.7/site-packages/oslo_utils/excutils.py\", line 196, in force_reraise\n six.reraise(self.type_, self.value, self.tb)\n\n File \"/opt/octavia/env/lib/python2.7/site-packages/octavia/api/v1/controllers/load_balancer.py\", line 151, in post\n load_balancer)\n\n File \"/opt/octavia/env/lib/python2.7/site-packages/octavia/api/v1/controllers/load_balancer.py\", line 84, in _create_load_balancer_graph_db\n lock_session, prepped_lb)\n\n File \"/opt/octavia/env/lib/python2.7/site-packages/octavia/db/repositories.py\", line 610, in create_load_balancer_tree\n lock_session.rollback()\n\n File \"/opt/octavia/env/lib/python2.7/site-packages/oslo_utils/excutils.py\", line 220, in __exit__\n self.force_reraise()\n\n File \"/opt/octavia/env/lib/python2.7/site-packages/oslo_utils/excutils.py\", line 196, in force_reraise\n six.reraise(self.type_, self.value, self.tb)\n\n File \"/opt/octavia/env/lib/python2.7/site-packages/octavia/db/repositories.py\", line 501, in create_load_balancer_tree\n lb_dict['project_id']):\n\n File \"/opt/octavia/env/lib/python2.7/site-packages/octavia/db/repositories.py\", line 400, in check_quota_met\n raise exceptions.ProjectBusyException()\n\nProjectBusyException: Project busy. Unable to lock the project. Please try again.\n",
"faultcode": "Server",
"faultstring": "Project busy. Unable to lock the project. Please try again."
}
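For reference, an equivalent way to issue the same create call with the body in a here-document, so the JSON graph is easier to read and edit. This is a sketch, not from the original report: it assumes the same `$SUBNET_ID`, `$MYTOKEN`, `$OCTAVIA_IP`, and `$OCTAVIA_PORT` environment variables, and it quotes `subnet_id` as a string (the one-liner above leaves it unquoted inside single quotes, where the shell will not expand it):

```shell
# Sketch: write the request body to a file, then POST it with curl -d @file.
# subnet_id is quoted here on the assumption it is a string UUID.
cat > /tmp/lb.json <<EOF
{
  "name": "test_lb",
  "vip": {"subnet_id": "$SUBNET_ID"},
  "listeners": [{
    "name": "test_listener",
    "protocol": "HTTP",
    "protocol_port": 80,
    "default_pool": {
      "name": "test_pool1",
      "protocol": "HTTP",
      "lb_algorithm": "ROUND_ROBIN",
      "members": [
        {"ip_address": "10.32.42.181", "protocol_port": 80},
        {"ip_address": "10.32.47.154", "protocol_port": 80}
      ]
    }
  }]
}
EOF
# Then send it (same endpoint as above):
#   curl -s -X POST -H "Content-Type: application/json" -H "X-Auth-Token: $MYTOKEN" \
#     -d @/tmp/lb.json http://${OCTAVIA_IP}:${OCTAVIA_PORT}/v1/loadbalancers | jq
```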
===============
octavia-api log
===============
127.0.0.1 - - [14/Feb/2017 20:03:27] "POST /v1/loadbalancers HTTP/1.1" 400 139
2017-02-14 20:03:46.655 24413 DEBUG octavia.db.repositories [req-3d1390e6-fe5a-4e38-bc0b-9cb2cf5640b4 - - - - -] Checking quota for project: e54e5a6636324fdda3f338a96e9fd60c object: <class 'octavia.common.data_models.LoadBalancer'> check_quota_met /opt/octavia/env/lib/python2.7/site-packages/octavia/db/repositories.py:263
2017-02-14 20:03:46.668 24413 DEBUG octavia.db.repositories [req-3d1390e6-fe5a-4e38-bc0b-9cb2cf5640b4 - - - - -] Checking quota for project: e54e5a6636324fdda3f338a96e9fd60c object: <class 'octavia.common.data_models.LoadBalancer'> check_quota_met /opt/octavia/env/lib/python2.7/site-packages/octavia/db/repositories.py:263
2017-02-14 20:04:37.652 24413 WARNING octavia.db.repositories [req-3d1390e6-fe5a-4e38-bc0b-9cb2cf5640b4 - - - - -] Quota project lock timed out for project: e54e5a6636324fdda3f338a96e9fd60c
2017-02-14 20:04:37.658 24413 ERROR wsme.api [req-3d1390e6-fe5a-4e38-bc0b-9cb2cf5640b4 - - - - -] Server-side error: "Project busy. Unable to lock the project. Please try again.". Detail:
Traceback (most recent call last):
File "/opt/octavia/env/lib/python2.7/site-packages/wsmeext/pecan.py", line 85, in callfunction
result = f(self, *args, **kwargs)
File "/opt/octavia/env/lib/python2.7/site-packages/octavia/api/v1/controllers/load_balancer.py", line 155, in post
lock_session.rollback()
File "/opt/octavia/env/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
self.force_reraise()
File "/opt/octavia/env/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
six.reraise(self.type_, self.value, self.tb)
File "/opt/octavia/env/lib/python2.7/site-packages/octavia/api/v1/controllers/load_balancer.py", line 151, in post
load_balancer)
File "/opt/octavia/env/lib/python2.7/site-packages/octavia/api/v1/controllers/load_balancer.py", line 84, in _create_load_balancer_graph_db
lock_session, prepped_lb)
File "/opt/octavia/env/lib/python2.7/site-packages/octavia/db/repositories.py", line 610, in create_load_balancer_tree
lock_session.rollback()
File "/opt/octavia/env/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
self.force_reraise()
File "/opt/octavia/env/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
six.reraise(self.type_, self.value, self.tb)
File "/opt/octavia/env/lib/python2.7/site-packages/octavia/db/repositories.py", line 501, in create_load_balancer_tree
lb_dict['project_id']):
File "/opt/octavia/env/lib/python2.7/site-packages/octavia/db/repositories.py", line 400, in check_quota_met
raise exceptions.ProjectBusyException()
ProjectBusyException: Project busy. Unable to lock the project. Please try again.
127.0.0.1 - - [14/Feb/2017 20:04:37] "POST /v1/loadbalancers HTTP/1.1" 503 1908
I can't reproduce this on my devstack.
I ran it this way: curl -s -X POST -H "Content-Type: application/json" -d '{"name": "test_lb", "project_id": "cf418ad823834c28b2ad3888d4b01fa8", "vip": {"subnet_id": "022f1424-6306-42fe-89de-204578eddf6f"}, "listeners": [{"name": "test_listener", "protocol": "HTTP", "protocol_port": 80, "default_pool": {"name": "test_pool1", "protocol": "HTTP", "lb_algorithm": "ROUND_ROBIN", "members": [{"ip_address": "172.21.1.15", "protocol_port": 80}, {"ip_address": "172.21.2.12", "protocol_port": 80}]}}]}' http://172.21.21.135:9876/v1/loadbalancers
I got the expected 202 and the LB came up functional.
Was there another, earlier failure before this one? Were there multiple requests going in at the same time? The API waits up to a minute to acquire the project quota row lock before giving up, which should be more than enough time for other requests to release their locks (that usually happens in under a second).
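Since the server's own advice is literally "Please try again", a client-side workaround while this is investigated is a small retry wrapper around the create call. This is a sketch of mine, not part of Octavia; the function name and backoff parameters are arbitrary:

```shell
# Retry a command that may fail transiently (e.g. a 503 ProjectBusyException),
# backing off between attempts. Usage: retry_busy <command> [args...]
retry_busy() {
  local attempt=0 max_attempts=5 delay=1
  until "$@"; do
    attempt=$((attempt + 1))
    if [ "$attempt" -ge "$max_attempts" ]; then
      echo "still failing after $max_attempts attempts, giving up" >&2
      return 1
    fi
    sleep "$delay"
    delay=$((delay * 2))  # exponential backoff before the next attempt
  done
}
```

With `curl -f` (so a 4xx/5xx response exits non-zero), the create request above could be wrapped as `retry_busy curl -sf -X POST ... http://${OCTAVIA_IP}:${OCTAVIA_PORT}/v1/loadbalancers`.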
Were you running master?