lbaas: after creating 375 LB pools, new lb-pools and VIPs go into ERROR status

Bug #1498359 reported by spark
This bug affects 1 person
Affects: octavia
Status: Expired
Importance: High
Assigned to: Unassigned

Bug Description

1. Create a two-arm LB with one client and one backend server on a tenant.
2. Repeat step 1 until 375 tenants have been created.
3. After step 2, the LB network becomes unstable; newly created pools and VIPs end up in ERROR (see the status-check sketch below).
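
For anyone re-running this scale test, a quick way to quantify the failure after each batch is to count the resources stuck in ERROR. A minimal sketch, assuming the LBaaS v1 CLI used throughout this report (the -f/-c output options are the same ones min wang's script below relies on); the helper itself is not part of the original report:

# Count LBaaS v1 VIPs and pools currently in ERROR (hypothetical check).
echo "VIPs in ERROR:  $(neutron lb-vip-list -f value -c status | grep -c ERROR)"
echo "pools in ERROR: $(neutron lb-pool-list -f value -c status | grep -c ERROR)"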

[root@nsj13 ~(keystone_admin)]# neutron lb-vip-list |grep ERROR
| 054bc376-ff50-40cb-b003-831569b41f0b | Scale1_vip_420 | 20.1.1.100 | HTTP | True | ERROR |
| 06fcb386-4563-48d1-b6de-003967a97a54 | Scale1_vip_418 | 20.1.1.100 | HTTP | True | ERROR |
| 160f9c87-5cb1-4a5e-9829-cc5de30114c6 | Scale1_vip_444 | 20.1.1.100 | HTTP | True | ERROR |
| 1f49c1ca-0dc9-4f6a-9efd-2c741fcc6ff3 | Scale1_vip_380 | 20.1.1.100 | HTTP | True | ERROR |
| 25040c46-446d-4e3f-841c-da9166bb4ad0 | Scale1_vip_384 | 20.1.1.100 | HTTP | True | ERROR |
| 2690bbdb-30e3-4c6c-b346-f96be83ea745 | Scale1_vip_419 | 20.1.1.100 | HTTP | True | ERROR |
| 27e9fbb4-a3c8-4084-af11-c3e71af1a762 | Scale1_vip_431 | 20.1.1.100 | HTTP | True | ERROR |
| 283aea00-ba23-403d-a4cd-0d275c36c53f | Scale1_vip_375 | 20.1.1.100 | HTTP | True | ERROR |
| 28dbb59d-28a9-4af2-b02a-9f953d4c9dd3 | Scale1_vip_428 | 20.1.1.100 | HTTP | True | ERROR |
| 30e99e31-2739-4d1b-a05e-f67d41f26c14 | Scale1_vip_430 | 20.1.1.100 | HTTP | True | ERROR |
| 37ee787f-d684-4968-95ea-15666b0cd0e7 | Scale1_vip_410 | 20.1.1.100 | HTTP | True | ERROR |
| 38fe0379-2c5a-4bcc-9aab-c0f75466b5bd | Scale1_vip_408 | 20.1.1.100 | HTTP | True | ERROR |
| 4190852c-0327-4b05-b1fd-20ef62cf06c0 | Scale1_vip_450 | 20.1.1.100 | HTTP | True | ERROR |
| 461a6da3-07e3-47d3-9d47-18ba2add0692 | Scale1_vip_445 | 20.1.1.100 | HTTP | True | ERROR |
| 49fb200a-9a73-4d93-a2ef-a5d033377ee2 | Scale1_vip_378 | 20.1.1.100 | HTTP | True | ERROR |
| 556a7e3a-d922-43c8-b584-416a8689421d | Scale1_vip_383 | 20.1.1.100 | HTTP | True | ERROR |
| 55a500b0-682d-4fc6-93e3-588dfc590b35 | Scale1_vip_421 | 20.1.1.100 | HTTP | True | ERROR |
| 5cb8347d-fbcc-497e-ba47-244d5b3f394b | Scale1_vip_429 | 20.1.1.100 | HTTP | True | ERROR |
| 606380c4-fd1e-43a6-b67d-01a01c7ad327 | Scale1_vip_412 | 20.1.1.100 | HTTP | True | ERROR |
| 60ff74a8-c464-49d0-9f8c-143c7629201e | Scale1_vip_407 | 20.1.1.100 | HTTP | True | ERROR |
| 62ce8e28-0c8f-41e1-831d-5ef7500dbb1f | Scale1_vip_453 | 20.1.1.100 | HTTP | True | ERROR |
| 691ab6a0-c06e-4e6a-9c7d-b70fc72146e0 | Scale1_vip_436 | 20.1.1.100 | HTTP | True | ERROR |
| 6f5c12d8-3640-45b5-a600-19f522227d2c | Scale1_vip_427 | 20.1.1.100 | HTTP | True | ERROR |
| 746ab955-20da-41cc-9830-9c33e41abc9c | Scale1_vip_415 | 20.1.1.100 | HTTP | True | ERROR |
| 75342fd8-99bd-4527-92f5-7f3d91193adb | Scale1_vip_414 | 20.1.1.100 | HTTP | True | ERROR |
| 851311f3-d84d-43e4-9bc2-6fa08c5d4e7e | Scale1_vip_416 | 20.1.1.100 | HTTP | True | ERROR |
| 8624e33d-bc23-4b78-bde2-ceb2f12e6b65 | Scale1_vip_417 | 20.1.1.100 | HTTP | True | ERROR |
| 895e8ee6-1e78-44e5-a542-3713e07f49cb | Scale1_vip_377 | 20.1.1.100 | HTTP | True | ERROR |
| 94855fcf-03fc-431a-b6bb-58abc0f2a084 | Scale1_vip_437 | 20.1.1.100 | HTTP | True | ERROR |
| 9af109fa-f301-4942-a559-f6ad75459f3d | Scale1_vip_448 | 20.1.1.100 | HTTP | True | ERROR |
| 9cf4bfa3-0073-4a46-a437-2f960ac41d34 | Scale1_vip_422 | 20.1.1.100 | HTTP | True | ERROR |
| a138b439-c8b8-459f-85b6-3193d9df5e6d | Scale1_vip_433 | 20.1.1.100 | HTTP | True | ERROR |
| a49b0dd1-55ce-4b52-8024-ce8f3219feab | Scale1_vip_435 | 20.1.1.100 | HTTP | True | ERROR |
| ae57eadd-3a2d-457c-9149-014d43b2a91b | Scale1_vip_374 | 20.1.1.100 | HTTP | True | ERROR |
| b70e7123-3e92-4ad1-bdf3-da58d5ed1a9f | Scale1_vip_426 | 20.1.1.100 | HTTP | True | ERROR |
| b79eef81-8631-4012-97ed-b68cf7f5706b | Scale1_vip_434 | 20.1.1.100 | HTTP | True | ERROR |
| bffecb85-1992-4c54-a411-8b8064c5b6ec | Scale1_vip_405 | 20.1.1.100 | HTTP | True | ERROR |
| c51a6ed5-3134-412c-aad0-320ed7e7aaa1 | Scale1_vip_373 | 20.1.1.100 | HTTP | True | ERROR |
| c6b8bb0f-eeca-4f2a-b9be-6162017daa5c | Scale1_vip_424 | 20.1.1.100 | HTTP | True | ERROR |
| c8153eaa-fc15-4af4-91ce-eb26c8ff3478 | Scale1_vip_432 | 20.1.1.100 | HTTP | True | ERROR |
| c883db3b-675f-443f-8295-6bc4ec35364f | Scale1_vip_423 | 20.1.1.100 | HTTP | True | ERROR |
| dec84569-902a-46e4-99ae-71698e9294c0 | Scale1_vip_406 | 20.1.1.100 | HTTP | True | ERROR |
| ec0e21a5-4a3a-4ba2-9682-d71234f0aaa4 | Scale1_vip_381 | 20.1.1.100 | HTTP | True | ERROR |
| ecbc2bc0-2c43-46a0-86af-e1dc13264131 | Scale1_vip_404 | 20.1.1.100 | HTTP | True | ERROR |
| f1149949-8ca3-4f13-8f7b-d7709743e832 | Scale1_vip_413 | 20.1.1.100 | HTTP | True | ERROR |
| f2b4a764-ca53-4d15-9f27-db3bb4852fce | Scale1_vip_382 | 20.1.1.100 | HTTP | True | ERROR |
| f2d20652-88ef-463f-859e-401987c40916 | Scale1_vip_379 | 20.1.1.100 | HTTP | True | ERROR |
| fc62d8bd-82ec-43c6-9bf6-351f5442bea0 | Scale1_vip_446 | 20.1.1.100 | HTTP | True | ERROR |
| ff435ed0-20f7-4331-bf4c-0b46f9596f56 | Scale1_vip_425 | 20.1.1.100 | HTTP | True | ERROR |
[root@nsj13 ~(keystone_admin)]# neutron lb-pool-list |grep ERROR
| 36253c92-8069-4543-bf56-c9a381ba7c1a | pool_462 | haproxy | ROUND_ROBIN | HTTP | True | ERROR |
| 4c252c93-30a0-4495-a1fe-f2e218b2c71e | pool_452 | haproxy | ROUND_ROBIN | HTTP | True | ERROR |
| 5a593359-ff14-45e1-a71e-153010847b08 | pool_461 | haproxy | ROUND_ROBIN | HTTP | True | ERROR |
| 63e982cd-05a1-4f39-a3d6-06130cb5cd91 | pool_455 | haproxy | ROUND_ROBIN | HTTP | True | ERROR |
| 8268f1db-b318-4a93-ab32-8a6dc872792f | pool_460 | haproxy | ROUND_ROBIN | HTTP | True | ERROR |
| 87f48c5e-962b-434f-ba68-f0520d22b3d1 | pool_454 | haproxy | ROUND_ROBIN | HTTP | True | ERROR |
| 8b328094-53e0-4cf4-bb69-90078c61c0fa | pool_457 | haproxy | ROUND_ROBIN | HTTP | True | ERROR |
| 970f73db-abe1-4aec-8189-d1326801fdd7 | pool_456 | haproxy | ROUND_ROBIN | HTTP | True | ERROR |
| a8d4051d-795b-4c95-b330-fe2cfcffc6bc | pool_449 | haproxy | ROUND_ROBIN | HTTP | True | ERROR |
| bc3429b4-0b57-4b2e-a737-774ca066ae72 | pool_463 | haproxy | ROUND_ROBIN | HTTP | True | ERROR |
| bc8c023a-64e3-441a-aed9-fca4296fc876 | pool_459 | haproxy | ROUND_ROBIN | HTTP | True | ERROR |
| e7339e01-093f-45ba-a1a2-8da33289a494 | pool_451 | haproxy | ROUND_ROBIN | HTTP | True | ERROR |
| e8a88420-68f5-46ef-a95d-920e6269342e | pool_458 | haproxy | ROUND_ROBIN | HTTP | True | ERROR |
| fc65aef1-b684-4a17-b1ff-792d25e1145e | pool_447 | haproxy | ROUND_ROBIN | HTTP | True | ERROR |
[root@nsj13 ~(keystone_admin)]# openstack-status

Log:
[root@nsj4 ~]# tail -n 100 /var/log/neutron/lbaas-agent.log
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/neutron/services/loadbalancer/agent/agent_api.py", line 71, in plug_vip_port
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager self.make_msg('plug_vip_port', port_id=port_id, host=self.host)
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/neutron/common/log.py", line 34, in wrapper
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager return method(*args, **kwargs)
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/neutron/common/rpc.py", line 161, in call
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager context, msg, rpc_method='call', **kwargs)
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/neutron/common/rpc.py", line 187, in __call_rpc_method
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager return func(context, msg['method'], **msg['args'])
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo/messaging/rpc/client.py", line 389, in call
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager return self.prepare().call(ctxt, method, **kwargs)
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo/messaging/rpc/client.py", line 152, in call
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager retry=self.retry)
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo/messaging/transport.py", line 90, in _send
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager timeout=timeout, retry=retry)
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 408, in send
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager retry=retry)
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 397, in _send
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager result = self._waiter.wait(msg_id, timeout)
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 285, in wait
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager reply, ending = self._poll_connection(msg_id, timeout)
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 235, in _poll_connection
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager % msg_id)
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager MessagingTimeout: Timed out waiting for a reply to message ID d66dca467e69496595d4464815bbf6e9
2015-08-28 05:43:35.765 40095 TRACE neutron.services.loadbalancer.agent.agent_manager
2015-08-28 05:44:37.762 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] No calling threads waiting for msg_id : f02c140b867e4b1c998e7447c2e3d380, message : {u'_unique_id': u'c023475cefc6477db675ebb7d64727a4', u'failure': None, u'result': None}
2015-08-28 05:44:37.765 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] _queues: {'5afbf8f442dc495d8290298e9c06bc9e': <Queue at 0x369db10 maxsize=None tasks=1 _cond=<Event at 0x372ab90 result=NOT_USED _exc=None _waiters[0]>>}
2015-08-28 05:44:37.834 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] No calling threads waiting for msg_id : f02c140b867e4b1c998e7447c2e3d380, message : {u'_unique_id': u'af5e3db8efe04c43b41c819e11bf1880', u'failure': None, u'result': None, u'ending': True}
2015-08-28 05:44:37.835 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] _queues: {'5afbf8f442dc495d8290298e9c06bc9e': <Queue at 0x369db10 maxsize=None tasks=1 _cond=<Event at 0x372ab90 result=NOT_USED _exc=None _waiters[0]>>}
2015-08-28 05:44:37.860 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] No calling threads waiting for msg_id : 006fe7ee3eb24367aa8100dda1814910, message : {u'_unique_id': u'035e3f0ce8f24bd6a53671e4116a0b40', u'failure': None, u'result': None}
2015-08-28 05:44:37.862 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] _queues: {'5afbf8f442dc495d8290298e9c06bc9e': <Queue at 0x369db10 maxsize=None tasks=1 _cond=<Event at 0x372ab90 result=NOT_USED _exc=None _waiters[0]>>}
2015-08-28 05:44:37.890 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] No calling threads waiting for msg_id : 006fe7ee3eb24367aa8100dda1814910, message : {u'_unique_id': u'8707a91bbc2144e1955c96ab75c0468b', u'failure': None, u'result': None, u'ending': True}
2015-08-28 05:44:37.891 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] _queues: {'5afbf8f442dc495d8290298e9c06bc9e': <Queue at 0x369db10 maxsize=None tasks=1 _cond=<Event at 0x372ab90 result=NOT_USED _exc=None _waiters[0]>>}
2015-08-28 05:44:38.044 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] No calling threads waiting for msg_id : ea968c06105340949c2b130a0786c092, message : {u'_unique_id': u'01ff1c98d63145978fbb734535d20589', u'failure': None, u'result': None}
2015-08-28 05:44:38.046 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] _queues: {'5afbf8f442dc495d8290298e9c06bc9e': <Queue at 0x369db10 maxsize=None tasks=1 _cond=<Event at 0x372ab90 result=NOT_USED _exc=None _waiters[0]>>}
2015-08-28 05:44:38.100 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] No calling threads waiting for msg_id : ea968c06105340949c2b130a0786c092, message : {u'_unique_id': u'c92290f3b4f847f998e5408f88a30d97', u'failure': None, u'result': None, u'ending': True}
2015-08-28 05:44:38.101 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] _queues: {'5afbf8f442dc495d8290298e9c06bc9e': <Queue at 0x369db10 maxsize=None tasks=1 _cond=<Event at 0x372ab90 result=NOT_USED _exc=None _waiters[0]>>}
2015-08-28 05:44:38.105 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] No calling threads waiting for msg_id : 88f054e542f742acac0ee271768dbf57, message : {u'_unique_id': u'3740812ef08a4967b487c5ac18dc4671', u'failure': None, u'result': None}
2015-08-28 05:44:38.106 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] _queues: {'5afbf8f442dc495d8290298e9c06bc9e': <Queue at 0x369db10 maxsize=None tasks=1 _cond=<Event at 0x372ab90 result=NOT_USED _exc=None _waiters[0]>>}
2015-08-28 05:44:38.196 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] No calling threads waiting for msg_id : 88f054e542f742acac0ee271768dbf57, message : {u'_unique_id': u'adf1bb3cbdc1449e99366a3612a4e219', u'failure': None, u'result': None, u'ending': True}
2015-08-28 05:44:38.197 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] _queues: {'5afbf8f442dc495d8290298e9c06bc9e': <Queue at 0x369db10 maxsize=None tasks=1 _cond=<Event at 0x372ab90 result=NOT_USED _exc=None _waiters[0]>>}
2015-08-28 05:44:42.359 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] No calling threads waiting for msg_id : e03acfd5c68445b380988c88b29edcd7, message : {u'_unique_id': u'fcb662f50bf442beb89f5a377350e5a2', u'failure': None, u'result': None}
2015-08-28 05:44:42.360 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] _queues: {'5afbf8f442dc495d8290298e9c06bc9e': <Queue at 0x369db10 maxsize=None tasks=1 _cond=<Event at 0x372ab90 result=NOT_USED _exc=None _waiters[0]>>}
2015-08-28 05:44:42.462 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] No calling threads waiting for msg_id : e03acfd5c68445b380988c88b29edcd7, message : {u'_unique_id': u'8a4379403bf54ff89e8edaf112af93c3', u'failure': None, u'result': None, u'ending': True}
2015-08-28 05:44:42.463 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] _queues: {'5afbf8f442dc495d8290298e9c06bc9e': <Queue at 0x369db10 maxsize=None tasks=1 _cond=<Event at 0x372ab90 result=NOT_USED _exc=None _waiters[0]>>}
2015-08-28 05:44:43.065 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] No calling threads waiting for msg_id : 67786396677c46b7ad7d95ceb1d8c825, message : {u'_unique_id': u'db431d6005964ebda5e8183a1a6a17b3', u'failure': None, u'result': None}
2015-08-28 05:44:43.067 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] _queues: {'5afbf8f442dc495d8290298e9c06bc9e': <Queue at 0x369db10 maxsize=None tasks=1 _cond=<Event at 0x372ab90 result=NOT_USED _exc=None _waiters[0]>>}
2015-08-28 05:44:43.114 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] No calling threads waiting for msg_id : 67786396677c46b7ad7d95ceb1d8c825, message : {u'_unique_id': u'9cccefdd6e0a49568726b7d591eacf3a', u'failure': None, u'result': None, u'ending': True}
2015-08-28 05:44:43.114 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] _queues: {'5afbf8f442dc495d8290298e9c06bc9e': <Queue at 0x369db10 maxsize=None tasks=1 _cond=<Event at 0x372ab90 result=NOT_USED _exc=None _waiters[0]>>}
2015-08-28 05:44:45.327 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] No calling threads waiting for msg_id : 596ad3b35e5744f8b7e0675e9cf72987, message : {u'_unique_id': u'f6525182203d4e65ab318a13367fe5aa', u'failure': None, u'result': None}
2015-08-28 05:44:45.327 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] _queues: {'5afbf8f442dc495d8290298e9c06bc9e': <Queue at 0x369db10 maxsize=None tasks=1 _cond=<Event at 0x372ab90 result=NOT_USED _exc=None _waiters[0]>>}
2015-08-28 05:44:45.383 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] No calling threads waiting for msg_id : 596ad3b35e5744f8b7e0675e9cf72987, message : {u'_unique_id': u'01d81f2e74894d3b965caf162bcec693', u'failure': None, u'result': None, u'ending': True}
2015-08-28 05:44:45.383 40095 WARNING oslo.messaging._drivers.amqpdriver [req-a2a28e4a-9e09-426b-b176-c3c5f16be16d ] _queues: {'5afbf8f442dc495d8290298e9c06bc9e': <
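
The MessagingTimeout in the traceback above means the agent's plug_vip_port RPC call to the neutron server never received a reply within the oslo.messaging timeout, which is consistent with the new VIPs and pools being flipped to ERROR. One mitigation sometimes applied at this scale is to raise the agent's RPC reply timeout; the following is only a sketch, assuming an RDO-style install with the openstack-utils helper available (crudini or editing neutron.conf by hand works just as well), and the 180-second value is an example, not a verified fix for this environment:

# Raise the RPC reply timeout used by neutron agents (default is 60 s),
# then restart the LBaaS agent so it picks up the change.
openstack-config --set /etc/neutron/neutron.conf DEFAULT rpc_response_timeout 180
systemctl restart neutron-lbaas-agent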

[root@nsj4 ~]# sar -P ALL 60 1
Linux 3.10.0-229.11.1.el7.x86_64 (nsj4) 08/28/2015 _x86_64_ (80 CPU)

06:43:31 AM CPU %user %nice %system %iowait %steal %idle
06:44:31 AM all 3.75 0.00 77.35 0.00 0.00 18.90
06:44:31 AM 0 2.93 0.00 87.15 0.00 0.00 9.92
06:44:31 AM 1 2.95 0.00 85.64 0.00 0.00 11.41
06:44:31 AM 2 3.06 0.00 85.34 0.00 0.00 11.60
06:44:31 AM 3 2.93 0.00 84.70 0.00 0.00 12.37
06:44:31 AM 4 2.90 0.00 82.70 0.00 0.00 14.40
06:44:31 AM 5 2.72 0.00 83.16 0.00 0.00 14.12
06:44:31 AM 6 2.48 0.00 77.42 0.00 0.00 20.11
06:44:31 AM 7 3.16 0.00 81.64 0.00 0.00 15.20
06:44:31 AM 8 2.79 0.00 81.67 0.00 0.00 15.54
06:44:31 AM 9 2.85 0.00 82.88 0.00 0.00 14.27
06:44:31 AM 10 2.84 0.00 78.69 0.00 0.00 18.47
06:44:31 AM 11 2.61 0.00 77.22 0.00 0.00 20.17
06:44:31 AM 12 10.38 0.00 72.14 0.00 0.00 17.48
06:44:31 AM 13 10.11 0.00 73.92 0.00 0.00 15.97
06:44:31 AM 14 12.90 0.00 71.64 0.00 0.00 15.45
06:44:31 AM 15 3.15 0.00 77.23 0.00 0.00 19.62
06:44:31 AM 16 9.18 0.00 69.87 0.00 0.00 20.95
06:44:31 AM 17 2.66 0.00 76.96 0.02 0.00 20.37
06:44:31 AM 18 2.37 0.00 79.16 0.00 0.00 18.47
06:44:31 AM 19 8.62 0.00 74.03 0.00 0.00 17.35
06:44:31 AM 20 2.50 0.00 78.26 0.00 0.00 19.24
06:44:31 AM 21 1.85 0.00 82.39 0.00 0.00 15.76
06:44:31 AM 22 1.84 0.00 78.68 0.00 0.00 19.48
06:44:31 AM 23 2.08 0.00 80.77 0.00 0.00 17.15
06:44:31 AM 24 1.65 0.00 77.46 0.00 0.00 20.89
06:44:31 AM 25 1.86 0.00 81.13 0.00 0.00 17.01
06:44:31 AM 26 1.94 0.00 82.37 0.00 0.00 15.69
06:44:31 AM 27 1.58 0.00 81.70 0.00 0.00 16.72
06:44:31 AM 28 1.93 0.00 80.35 0.00 0.00 17.72
06:44:31 AM 29 2.03 0.00 81.64 0.00 0.00 16.33
06:44:31 AM 30 3.25 0.00 80.76 0.00 0.00 15.98
06:44:31 AM 31 3.39 0.00 76.86 0.00 0.00 19.75
06:44:31 AM 32 3.16 0.00 76.93 0.00 0.00 19.91
06:44:31 AM 33 3.33 0.00 76.24 0.00 0.00 20.44
06:44:31 AM 34 3.31 0.00 77.69 0.00 0.00 19.00
06:44:31 AM 35 2.71 0.00 76.26 0.00 0.00 21.04
06:44:31 AM 36 3.12 0.00 77.72 0.00 0.00 19.16
06:44:31 AM 37 2.84 0.00 76.71 0.00 0.00 20.45
06:44:31 AM 38 2.94 0.00 76.91 0.00 0.00 20.15
06:44:31 AM 39 3.45 0.00 75.24 0.00 0.00 21.31
06:44:31 AM 40 3.20 0.00 77.62 0.00 0.00 19.19
06:44:31 AM 41 3.17 0.00 74.88 0.00 0.00 21.95
06:44:31 AM 42 3.12 0.00 75.38 0.00 0.00 21.51
06:44:31 AM 43 3.60 0.00 76.31 0.00 0.00 20.09
06:44:31 AM 44 3.21 0.00 74.74 0.00 0.00 22.05
06:44:31 AM 45 3.23 0.00 77.64 0.00 0.00 19.12
06:44:31 AM 46 3.43 0.00 76.91 0.00 0.00 19.66
06:44:31 AM 47 3.21 0.00 72.91 0.00 0.00 23.88
06:44:31 AM 48 3.36 0.00 76.26 0.00 0.00 20.38
06:44:31 AM 49 3.51 0.00 71.24 0.00 0.00 25.25
06:44:31 AM 50 2.93 0.00 77.88 0.00 0.00 19.19
06:44:31 AM 51 7.24 0.00 76.29 0.00 0.00 16.47
06:44:31 AM 52 16.88 0.00 66.45 0.00 0.00 16.66
06:44:31 AM 53 5.08 0.00 73.77 0.00 0.00 21.15
06:44:31 AM 54 5.10 0.00 73.80 0.00 0.00 21.10
06:44:31 AM 55 2.80 0.00 75.70 0.00 0.00 21.50
06:44:31 AM 56 2.73 0.00 79.63 0.00 0.00 17.64
06:44:31 AM 57 7.22 0.00 74.76 0.00 0.00 18.02
06:44:31 AM 58 11.35 0.00 70.01 0.00 0.00 18.64
06:44:31 AM 59 4.72 0.00 73.01 0.00 0.00 22.27
06:44:31 AM 60 2.38 0.00 77.60 0.00 0.00 20.02
06:44:31 AM 61 1.81 0.00 76.30 0.00 0.00 21.89
06:44:31 AM 62 2.10 0.00 77.76 0.00 0.00 20.15
06:44:31 AM 63 2.04 0.00 76.05 0.00 0.00 21.90
06:44:31 AM 64 2.43 0.00 78.39 0.00 0.00 19.18
06:44:31 AM 65 2.26 0.00 76.61 0.00 0.00 21.12
06:44:31 AM 66 1.86 0.00 76.79 0.00 0.00 21.35
06:44:31 AM 67 1.81 0.00 76.18 0.00 0.00 22.01
06:44:31 AM 68 2.08 0.00 75.54 0.00 0.00 22.38
06:44:31 AM 69 2.38 0.00 73.08 0.00 0.00 24.54
06:44:31 AM 70 3.89 0.00 76.35 0.00 0.00 19.77
06:44:31 AM 71 2.97 0.00 76.53 0.00 0.00 20.50
06:44:31 AM 72 4.56 0.00 75.57 0.00 0.00 19.88
06:44:31 AM 73 3.44 0.00 80.77 0.00 0.00 15.79
06:44:31 AM 74 3.45 0.00 79.40 0.00 0.00 17.14
06:44:31 AM 75 3.13 0.00 77.52 0.00 0.00 19.35
06:44:31 AM 76 3.34 0.00 77.10 0.00 0.00 19.57
06:44:31 AM 77 3.67 0.00 77.45 0.00 0.00 18.88
06:44:31 AM 78 3.21 0.00 76.19 0.00 0.00 20.60
06:44:31 AM 79 3.60 0.00 73.31 0.00 0.00 23.10

Average: CPU %user %nice %system %iowait %steal %idle
Average: all 3.75 0.00 77.35 0.00 0.00 18.90
Average: 0 2.93 0.00 87.15 0.00 0.00 9.92
Average: 1 2.95 0.00 85.64 0.00 0.00 11.41
Average: 2 3.06 0.00 85.34 0.00 0.00 11.60
Average: 3 2.93 0.00 84.70 0.00 0.00 12.37
Average: 4 2.90 0.00 82.70 0.00 0.00 14.40
Average: 5 2.72 0.00 83.16 0.00 0.00 14.12
Average: 6 2.48 0.00 77.42 0.00 0.00 20.11
Average: 7 3.16 0.00 81.64 0.00 0.00 15.20
Average: 8 2.79 0.00 81.67 0.00 0.00 15.54
Average: 9 2.85 0.00 82.88 0.00 0.00 14.27
Average: 10 2.84 0.00 78.69 0.00 0.00 18.47
Average: 11 2.61 0.00 77.22 0.00 0.00 20.17
Average: 12 10.38 0.00 72.14 0.00 0.00 17.48
Average: 13 10.11 0.00 73.92 0.00 0.00 15.97
Average: 14 12.90 0.00 71.64 0.00 0.00 15.45
Average: 15 3.15 0.00 77.23 0.00 0.00 19.62
Average: 16 9.18 0.00 69.87 0.00 0.00 20.95
Average: 17 2.66 0.00 76.96 0.02 0.00 20.37
Average: 18 2.37 0.00 79.16 0.00 0.00 18.47
Average: 19 8.62 0.00 74.03 0.00 0.00 17.35
Average: 20 2.50 0.00 78.26 0.00 0.00 19.24
Average: 21 1.85 0.00 82.39 0.00 0.00 15.76
Average: 22 1.84 0.00 78.68 0.00 0.00 19.48
Average: 23 2.08 0.00 80.77 0.00 0.00 17.15
Average: 24 1.65 0.00 77.46 0.00 0.00 20.89
Average: 25 1.86 0.00 81.13 0.00 0.00 17.01
Average: 26 1.94 0.00 82.37 0.00 0.00 15.69
Average: 27 1.58 0.00 81.70 0.00 0.00 16.72
Average: 28 1.93 0.00 80.35 0.00 0.00 17.72
Average: 29 2.03 0.00 81.64 0.00 0.00 16.33
Average: 30 3.25 0.00 80.76 0.00 0.00 15.98
Average: 31 3.39 0.00 76.86 0.00 0.00 19.75
Average: 32 3.16 0.00 76.93 0.00 0.00 19.91
Average: 33 3.33 0.00 76.24 0.00 0.00 20.44
Average: 34 3.31 0.00 77.69 0.00 0.00 19.00
Average: 35 2.71 0.00 76.26 0.00 0.00 21.04
Average: 36 3.12 0.00 77.72 0.00 0.00 19.16
Average: 37 2.84 0.00 76.71 0.00 0.00 20.45
Average: 38 2.94 0.00 76.91 0.00 0.00 20.15
Average: 39 3.45 0.00 75.24 0.00 0.00 21.31
Average: 40 3.20 0.00 77.62 0.00 0.00 19.19
Average: 41 3.17 0.00 74.88 0.00 0.00 21.95
Average: 42 3.12 0.00 75.38 0.00 0.00 21.51
Average: 43 3.60 0.00 76.31 0.00 0.00 20.09
Average: 44 3.21 0.00 74.74 0.00 0.00 22.05
Average: 45 3.23 0.00 77.64 0.00 0.00 19.12
Average: 46 3.43 0.00 76.91 0.00 0.00 19.66
Average: 47 3.21 0.00 72.91 0.00 0.00 23.88
Average: 48 3.36 0.00 76.26 0.00 0.00 20.38
Average: 49 3.51 0.00 71.24 0.00 0.00 25.25
Average: 50 2.93 0.00 77.88 0.00 0.00 19.19
Average: 51 7.24 0.00 76.29 0.00 0.00 16.47
Average: 52 16.88 0.00 66.45 0.00 0.00 16.66
Average: 53 5.08 0.00 73.77 0.00 0.00 21.15
Average: 54 5.10 0.00 73.80 0.00 0.00 21.10
Average: 55 2.80 0.00 75.70 0.00 0.00 21.50
Average: 56 2.73 0.00 79.63 0.00 0.00 17.64
Average: 57 7.22 0.00 74.76 0.00 0.00 18.02
Average: 58 11.35 0.00 70.01 0.00 0.00 18.64
Average: 59 4.72 0.00 73.01 0.00 0.00 22.27
Average: 60 2.38 0.00 77.60 0.00 0.00 20.02
Average: 61 1.81 0.00 76.30 0.00 0.00 21.89
Average: 62 2.10 0.00 77.76 0.00 0.00 20.15
Average: 63 2.04 0.00 76.05 0.00 0.00 21.90
Average: 64 2.43 0.00 78.39 0.00 0.00 19.18
Average: 65 2.26 0.00 76.61 0.00 0.00 21.12
Average: 66 1.86 0.00 76.79 0.00 0.00 21.35
Average: 67 1.81 0.00 76.18 0.00 0.00 22.01
Average: 68 2.08 0.00 75.54 0.00 0.00 22.38
Average: 69 2.38 0.00 73.08 0.00 0.00 24.54
Average: 70 3.89 0.00 76.35 0.00 0.00 19.77
Average: 71 2.97 0.00 76.53 0.00 0.00 20.50
Average: 72 4.56 0.00 75.57 0.00 0.00 19.88
Average: 73 3.44 0.00 80.77 0.00 0.00 15.79
Average: 74 3.45 0.00 79.40 0.00 0.00 17.14
Average: 75 3.13 0.00 77.52 0.00 0.00 19.35
Average: 76 3.34 0.00 77.10 0.00 0.00 19.57
Average: 77 3.67 0.00 77.45 0.00 0.00 18.88
Average: 78 3.21 0.00 76.19 0.00 0.00 20.60
Average: 79 3.60 0.00 73.31 0.00 0.00 23.10
[root@nsj4 ~]# free
              total used free shared buff/cache available
Mem: 263961496 13581348 243980080 1533280 6400068 245304560
Swap: 1048572 0 1048572
[root@nsj4 ~]# tail -n 100 /var/log/neutron/lbaas-agent.log

Tags: lbaas
spark (spark-liu)
description: updated
Alan (kaihongd)
summary: - after create 375 LB pool , the new lb -pool and vip get in error status
+ lbaas:after create 375 LB pool , the new lb -pool and vip get in error
+ status
spark (spark-liu)
description: updated
Revision history for this message
Doug Wiegley (dougwig) wrote :

We need to get this triaged.

Changed in neutron:
status: New → Confirmed
importance: Undecided → High
Revision history for this message
min wang (swiftwangster) wrote :

I set up devstack and installed the LBaaS v1 plugin, created a subnet that allows up to 375 IP addresses, created a tenant and, under that tenant, created a pool, a member, and a VIP, then repeated this loop 375 times. At the end I ran the command lines above but did not find any VIP or pool in ERROR status; all of my VIPs and pools are ACTIVE, so this ticket is not reproduced by my test script.

stack@test-virtual-machine:~/devstack$ neutron lb-vip-list |grep ERROR
stack@test-virtual-machine:~/devstack$ neutron lb-pool-list |grep ERROR

The following are the two scripts I wrote; you may try them in your devstack.

#!/bin/bash

function create_resources(){
   for i in `seq 1 2`;
   do
      echo "create tenants"
      tenant_id=$(openstack project create tenant-$i -c "id" -f "value")
      echo "tenant id is " $tenant_id

      echo "create pool"
      neutron lb-pool-create --name "pool-$i" --lb-method ROUND_ROBIN --protocol HTTP --tenant-id $tenant_id --subnet-id $(neutron subnet-list | awk '/ demo-subnet / {print $2}')
      sleep 1

      echo "create member"
      neutron lb-member-create --address 10.0.0.0 --protocol-port 80 --tenant-id $tenant_id "pool-$i"
      sleep 1

      echo "create vip"
      neutron lb-vip-create --name "vip-$i" --protocol-port 80 --protocol HTTP --tenant-id $tenant_id --subnet-id $(neutron subnet-list | awk '/ demo-subnet / {print $2}') "pool-$i"
      sleep 3

   done
}
create_resources

#!/bin/bash

function delete_resources(){
   echo "delete member"
   neutron lb-member-list -c "id" -f "value"|sed "s/ /\n/g" | xargs -L 1 neutron lb-member-delete
   for i in `seq 1 2`;
   do
      echo "Delete vip"
      neutron lb-vip-delete "vip-$i"

      echo "delete pool"
      neutron lb-pool-delete pool-$i

      echo "Delete tenants"
      openstack project delete tenant-$i

   done
}
delete_resources

Revision history for this message
min wang (swiftwangster) wrote :

Also, you need to change the loop from for i in `seq 1 2` to for i in `seq 1 375` to reproduce at the reported scale.
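
If you keep the scripts as files, a small convenience is to patch the iteration count in place instead of editing it by hand; the filenames below are hypothetical and not part of min wang's comment:

# Bump both scripts from the 2-iteration smoke test to the full 375 run
# (assumes they were saved as create_resources.sh and delete_resources.sh).
sed -i 's/seq 1 2/seq 1 375/' create_resources.sh delete_resources.sh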

Revision history for this message
Ashutosh Mishra (mca-ashu4) wrote :

I tried this in my devstack setup, creating LBaaS resources with @min wang's script.
I created 500 resources and did not observe any such issue.

stack@Devstack-5:~$
stack@Devstack-5:~$ neutron lb-pool-list | grep pool-49
| 040b9fc0-e13d-4788-949b-464dbba0b33e | pool-49 | haproxy | ROUND_ROBIN | HTTP | True | ACTIVE |
| 1196f3a9-39ee-4039-b368-8814d2f4af39 | pool-490 | haproxy | ROUND_ROBIN | HTTP | True | ACTIVE |
| 16cdca16-a01a-486b-8fd4-6870f6c4d83e | pool-497 | haproxy | ROUND_ROBIN | HTTP | True | ACTIVE |
| 460c88fb-630f-40ff-80ec-eaff3451fc54 | pool-499 | haproxy | ROUND_ROBIN | HTTP | True | ACTIVE |
| 54cf6250-1cad-48f9-85db-568cb4140d5f | pool-495 | haproxy | ROUND_ROBIN | HTTP | True | ACTIVE |
| 6604894a-149d-4e58-a915-cae141f36443 | pool-496 | haproxy | ROUND_ROBIN | HTTP | True | ACTIVE |
| 9ad4708e-06e2-4e82-bd99-0cdbe7d365c2 | pool-494 | haproxy | ROUND_ROBIN | HTTP | True | ACTIVE |
| ccca37e3-9f62-4960-a474-937c71ee8403 | pool-498 | haproxy | ROUND_ROBIN | HTTP | True | ACTIVE |
| d26cb831-3b7c-491f-842d-c71a76eb6645 | pool-492 | haproxy | ROUND_ROBIN | HTTP | True | ACTIVE |
| dd5b1758-34df-4ed2-9c8b-ada90dcad8e5 | pool-491 | haproxy | ROUND_ROBIN | HTTP | True | ACTIVE |
| fbc4c9f6-74c6-4c49-9420-0b0527a05574 | pool-493 | haproxy | ROUND_ROBIN | HTTP | True | ACTIVE |
stack@Devstack-5:~$ neutron lb-vip-list | grep vip-49
| 0fd42d6c-1f3d-4556-83d6-0f3a8fa21475 | vip-497 | 21.0.3.112 | HTTP | True | ACTIVE |
| 1ea674da-a10c-4b16-8bc2-faf49fdaf358 | vip-499 | 21.0.3.114 | HTTP | True | ACTIVE |
| 2d89b496-3174-42e8-ba61-295c19a8c193 | vip-496 | 21.0.3.111 | HTTP | True | ACTIVE |
| 2e5f3a55-8ef0-41c5-a623-723c70f94987 | vip-491 | 21.0.3.106 | HTTP | True | ACTIVE |
| 30bc61f7-1800-43ac-92d8-1b7f9d4d33c1 | vip-49 | 21.0.1.176 | HTTP | True | ACTIVE |
| 31983624-9a0c-4bfd-8a61-abad9ad85612 | vip-493 | 21.0.3.108 | HTTP | True | ACTIVE |
| 60caf97b-16e0-4208-a859-a4ccf6ce776e | vip-494 | 21.0.3.109 | HTTP | True | ACTIVE |
| 8b2d9b93-1748-4b9c-8ca4-46a740761e22 | vip-495 | 21.0.3.110 | HTTP | True | ACTIVE |
| ba2a2d6f-dd30-479f-8fbd-fe348a715762 | vip-492 | 21.0.3.107 | HTTP | True | ACTIVE |
| dc5d00e0-9b62-4b69-912e-58e21a2d4ca1 | vip-490 | 21.0.3.105 | HTTP | True | ACTIVE |
| f9e82a8b-fda2-4c31-8aac-3038dd6bfbc5 | vip-498 | 21.0.3.113 | HTTP | True | ACTIVE |
stack@Devstack-5:~$
stack@Devstack-5:~$

Revision history for this message
Wei Wang (damon-devops) wrote :

This looks more like an MQ (message queue) problem.
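
If it is an MQ problem, a quick check (an assumption about the next diagnostic step; nothing in this report shows it was run) is whether RPC queues on the broker are backing up or have lost their consumers, for example with rabbitmqctl on the controller:

# Show the 20 deepest queues with their consumer counts; q-plugin or
# *_fanout queues piling up without consumers would point at RPC congestion.
rabbitmqctl list_queues name messages consumers | sort -k2 -nr | head -20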

Changed in neutron:
status: Confirmed → Incomplete
affects: neutron → octavia
Revision history for this message
Launchpad Janitor (janitor) wrote :

[Expired for octavia because there has been no activity for 60 days.]

Changed in octavia:
status: Incomplete → Expired