The global amount of Octavia loadbalancers is constrained by the service project quotas
Affects: tripleo
Status: Fix Released
Importance: Medium
Assigned to: Brent Eagles
Bug Description
First discovered in https:/
Octavia creates amphorae (service VMs) under an operator-configured project (tenant). In TripleO, we currently use the 'service' project by default.
Since booting amphorae consumes that project's quota, this effectively imposes a very low (around 10) global upper bound on the number of load balancers.
This is contrary to how quotas are normally used in OpenStack.
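For context, the project Octavia boots amphorae under comes from its service credentials. A minimal illustrative fragment of octavia.conf, assuming the standard `[service_auth]` section (hostnames and password here are hypothetical placeholders):

```ini
# /etc/octavia/octavia.conf (illustrative values only)
[service_auth]
auth_url = http://keystone.example.com:5000/v3
username = octavia
password = CHANGE_ME
project_name = service        # amphorae are booted under this project
user_domain_name = Default
project_domain_name = Default
```

Whatever project is named here is the one whose ports/cores/instances/ram/security-group quotas the amphorae consume.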
In summary:
When user 'test' creates a load balancer in project 'xyz':
1. Load-balancer-related quota is consumed in project 'xyz' (expected).
2. Port, core, instance, RAM, and security group quotas are consumed in project 'service'. This will eventually prevent users in *any* project from creating load balancers, even if they have not fully consumed their own project's load balancer quota.
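The mismatch can be inspected with the standard OpenStack CLI (project names below are just the ones from the example above; admin credentials assumed):

```shell
# The user's project only shows load-balancer quota being consumed:
openstack quota show xyz

# The (low, default) compute/network limits that actually cap amphora
# creation live on the Octavia service project:
openstack quota show service
```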
To fix this:
We need to treat the 'service' project as a system project, so that Octavia's amphora VMs are not limited by ordinary project quotas.
We need to set '-1' (unlimited) for the following quotas (in the 'service' project only):
1. ports
2. cores
3. instances
4. ram
5. security groups
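Assuming the standard python-openstackclient CLI, the quota change above can be sketched as a single admin command (TripleO would apply the equivalent during deployment):

```shell
# Make the Octavia service project's relevant quotas unlimited (-1)
# so amphora VMs no longer hit the project caps. Run as cloud admin.
openstack quota set \
  --ports -1 \
  --cores -1 \
  --instances -1 \
  --ram -1 \
  --secgroups -1 \
  service
```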
Changed in tripleo:
  importance: Undecided → Medium
  milestone: none → rocky-2
Changed in tripleo:
  assignee: Brent Eagles (beagles) → Carlos Goncalves (cgoncalves)
Changed in tripleo:
  assignee: Carlos Goncalves (cgoncalves) → Brent Eagles (beagles)
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server [-] Exception during message handling: OverQuotaClient: Quota exceeded for resources: ['security_group'].
Neutron server returns request_ids: ['req-a80c6e5c-e8ff-4b59-91c7-5c8e21c5b76e']
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 220, in dispatch
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 190, in _do_dispatch
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/controller/queue/endpoint.py", line 44, in create_load_balancer
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server     self.worker.create_load_balancer(load_balancer_id)
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/controller/worker/controller_worker.py", line 284, in create_load_balancer
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server     create_lb_tf.run()
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 247, in run
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server     for _state in self.run_iter(timeout=timeout):
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 340, in run_iter
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server     failure.Failure.reraise_if_any(er_failures)
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 336, in reraise_if_any
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server     failures[0].reraise()
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 343, in reraise
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server     six.reraise(*self._exc_info)
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server     result = task.execute(**arguments)
2018-05-08 12:45:40.090 21 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/controller/worker/tasks/network_tasks.py", line 278, in execute