Lbaasv2 command logs not seen

Bug #1464241 reported by Alex Stafeyev
This bug affects 4 people
Affects: octavia
Status: Won't Fix
Importance: High
Assigned to: Unassigned

Bug Description

I am testing correct and incorrect LBaaSv2 deletions. Even when a command fails, nothing appears in /var/log/neutron/lbaasv2-agent.log.

However, the lbaas (not lbaasv2) agent log is being updated and contains errors:

2015-06-11 03:03:34.352 21274 WARNING neutron.openstack.common.loopingcall [-] task <bound method LbaasAgentManager.run_periodic_tasks of <neutron_lbaas.services.loadbalancer.agent.agent_manager.LbaasAgentManager object at 0x274dfd0>> run outlasted interval by 50.10 sec
2015-06-11 03:04:34.366 21274 ERROR neutron_lbaas.services.loadbalancer.agent.agent_manager [-] Unable to retrieve ready devices
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager Traceback (most recent call last):
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/neutron_lbaas/services/loadbalancer/agent/agent_manager.py", line 152, in sync_state
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager ready_instances = set(self.plugin_rpc.get_ready_devices())
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/neutron_lbaas/services/loadbalancer/agent/agent_api.py", line 36, in get_ready_devices
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager return cctxt.call(self.context, 'get_ready_devices', host=self.host)
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/client.py", line 156, in call
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager retry=self.retry)
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo_messaging/transport.py", line 90, in _send
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager timeout=timeout, retry=retry)
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 350, in send
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager retry=retry)
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 339, in _send
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager result = self._waiter.wait(msg_id, timeout)
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 243, in wait
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager message = self.waiters.get(msg_id, timeout=timeout)
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 149, in get
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager 'to message ID %s' % msg_id)
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager MessagingTimeout: Timed out waiting for a reply to message ID 73130a6bb5444f259dbf810cfb1003b3
2015-06-11 03:04:34.366 21274 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager

Configure an LBaaSv2 setup: load balancer, listener, member, pool, health monitor.
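As a sketch, the setup above can be built with the Kilo-era neutron LBaaSv2 CLI; the object names, the subnet, and the member address below are placeholders for your environment, not values from this report:

```shell
# Create the LBaaSv2 objects top-down. "private-subnet" and the
# member address 10.0.0.4 are placeholders.
neutron lbaas-loadbalancer-create --name lb1 private-subnet
neutron lbaas-listener-create --name listener1 --loadbalancer lb1 \
    --protocol HTTP --protocol-port 80
neutron lbaas-pool-create --name pool1 --listener listener1 \
    --protocol HTTP --lb-algorithm ROUND_ROBIN
neutron lbaas-member-create --subnet private-subnet \
    --address 10.0.0.4 --protocol-port 80 pool1
neutron lbaas-healthmonitor-create --type HTTP --delay 5 \
    --timeout 2 --max-retries 3 --pool pool1

# Then delete (correctly or with a bad ID) while watching the agent log:
# tail -f /var/log/neutron/lbaasv2-agent.log
```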

Check the LBaaSv2 and LBaaS agent logs:
 /var/log/neutron/lbaasv2-agent.log
 /var/log/neutron/lbaas-agent.log

Environment: LBaaSv2, Kilo, RHEL 7.1
openstack-neutron-lbaas-2015.1.0-3.el7ost.noarch
python-neutron-lbaas-2015.1.0-3.el7ost.noarch

Tags: lbaas
Revision history for this message
Prashant (pczanwar) wrote :

I am getting this too; I have the same setup as above.

2015-09-23 11:55:37.512 6235 ERROR neutron_lbaas.services.loadbalancer.agent.agent_manager [-] Unable to retrieve ready devices
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager Traceback (most recent call last):
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/neutron_lbaas/services/loadbalancer/agent/agent_manager.py", line 152, in sync_state
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager ready_instances = set(self.plugin_rpc.get_ready_devices())
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/neutron_lbaas/services/loadbalancer/agent/agent_api.py", line 36, in get_ready_devices
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager return cctxt.call(self.context, 'get_ready_devices', host=self.host)
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/client.py", line 156, in call
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager retry=self.retry)
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo_messaging/transport.py", line 90, in _send
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager timeout=timeout, retry=retry)
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 350, in send
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager retry=retry)
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 339, in _send
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager result = self._waiter.wait(msg_id, timeout)
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 243, in wait
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager message = self.waiters.get(msg_id, timeout=timeout)
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager File "/usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 149, in get
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager 'to message ID %s' % msg_id)
2015-09-23 11:55:37.512 6235 TRACE neutron_lbaas.services.loadbalancer.agent.agent_manager MessagingTimeout: Timed out waiting for a reply to message ID c37f2badb3bd4f259905dcb79130d3ff

Daneyon Hansen (danehans) wrote :
Piotr Kopec (piotrkopec-deactivatedaccount) wrote :

I am getting this too. Installed using Fuel 8 and set up according to the Liberty Networking documentation.

Changed in neutron:
importance: Undecided → High
affects: neutron → octavia
Michael Johnson (johnsom) wrote :

This is against the LBaaSv1 code (agent) and driver. LBaaSv1 has been removed from the codebase, as Mitaka is now EOL.

Changed in octavia:
status: New → Won't Fix
Haiwei Xu (xu-haiwei) wrote :

I still see this issue on the Ocata version.

2017-09-15 17:40:24.115 32107 ERROR neutron.common.rpc [-] Timeout in RPC method get_ready_devices. Waiting for 22 seconds before next attempt. If the server is not down, consider increasing the rpc_response_timeout option as Neutron server(s) may be overloaded and unable to respond quickly enough.
2017-09-15 17:40:24.115 32107 WARNING neutron.common.rpc [-] Increasing timeout for get_ready_devices calls to 480 seconds. Restart the agent to restore it to the default value.
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager [-] Unable to retrieve ready devices
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager Traceback (most recent call last):
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager File "/usr/lib/python2.7/dist-packages/neutron_lbaas/agent/agent_manager.py", line 155, in sync_state
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager ready_instances = set(self.plugin_rpc.get_ready_devices())
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager File "/usr/lib/python2.7/dist-packages/neutron_lbaas/agent/agent_api.py", line 34, in get_ready_devices
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager return cctxt.call(self.context, 'get_ready_devices', host=self.host)
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager File "/usr/lib/python2.7/dist-packages/neutron/common/rpc.py", line 174, in call
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager time.sleep(wait)
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager self.force_reraise()
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager six.reraise(self.type_, self.value, self.tb)
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager File "/usr/lib/python2.7/dist-packages/neutron/common/rpc.py", line 151, in call
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager return self._original_context.call(ctxt, method, **kwargs)
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 169, in call
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager retry=self.retry)
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager File "/usr/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 97, in _send
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager timeout=timeout, retry=retry)
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 458, in send
2017-09-15 17:40:46.413 32107 ERROR neutron_lbaas.agent.agent_manager retry=r...

