TestNetworkBasicOps.test_network_basic_ops failed with "Timed out waiting for 172.24.5.8 to become reachable"

Bug #1711463 reported by Ihar Hrachyshka on 2017-08-17
Affects: neutron
Status: Confirmed
Importance: High
Assigned to: Miguel Lavalle
Milestone: (none)

Bug Description

http://logs.openstack.org/45/493945/2/gate/gate-grenade-dsvm-neutron-dvr-multinode-ubuntu-xenial/c101360/logs/testr_results.html.gz

2017-08-17 19:34:47,447 1906 DEBUG [tempest.scenario.manager] checking network connections to IP 172.24.5.8 with user: cirros
2017-08-17 19:34:47,448 1906 DEBUG [tempest.scenario.manager] TestNetworkBasicOps:test_network_basic_ops begins to ping 172.24.5.8 in 120 sec and the expected result is reachable
2017-08-17 19:36:47,871 1906 DEBUG [tempest.scenario.manager] TestNetworkBasicOps:test_network_basic_ops finishes ping 172.24.5.8 in 120 sec and the ping result is unexpected
2017-08-17 19:36:47,872 1906 ERROR [tempest.scenario.manager] Public network connectivity check failed: after re-associate floating ip
2017-08-17 19:36:47.872 1906 ERROR tempest.scenario.manager Traceback (most recent call last):
2017-08-17 19:36:47.872 1906 ERROR tempest.scenario.manager File "tempest/scenario/manager.py", line 609, in check_public_network_connectivity
2017-08-17 19:36:47.872 1906 ERROR tempest.scenario.manager mtu=mtu)
2017-08-17 19:36:47.872 1906 ERROR tempest.scenario.manager File "tempest/scenario/manager.py", line 592, in check_vm_connectivity
2017-08-17 19:36:47.872 1906 ERROR tempest.scenario.manager msg=msg)
2017-08-17 19:36:47.872 1906 ERROR tempest.scenario.manager File "/opt/stack/new/tempest/.tox/tempest/local/lib/python2.7/site-packages/unittest2/case.py", line 702, in assertTrue
2017-08-17 19:36:47.872 1906 ERROR tempest.scenario.manager raise self.failureException(msg)
2017-08-17 19:36:47.872 1906 ERROR tempest.scenario.manager AssertionError: False is not true : Timed out waiting for 172.24.5.8 to become reachable
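
For context, the tempest check that produces this failure is essentially a ping-until-reachable loop with a 120-second deadline, as the "begins to ping ... in 120 sec" lines above show. A minimal sketch of that pattern (the helper name and the one-second retry interval are assumptions here, not tempest's actual code):

    import subprocess
    import time

    def ping_until_reachable(ip, timeout=120, interval=1):
        # Send one ICMP echo per attempt until the host answers or the
        # deadline passes; mirrors the reachable/unreachable log lines above.
        deadline = time.time() + timeout
        while time.time() < deadline:
            if subprocess.call(['ping', '-c', '1', '-w', '1', ip]) == 0:
                return True
            time.sleep(interval)
        return False

    # The scenario then asserts the result, producing the AssertionError above:
    # assert ping_until_reachable('172.24.5.8'), \
    #     'Timed out waiting for 172.24.5.8 to become reachable'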

In the l3 agent log, we see the address being configured:

Aug 17 19:33:40.782930 ubuntu-xenial-2-node-citycloud-kna1-10495241 neutron-l3-agent[21093]: DEBUG neutron.agent.linux.utils [-] Running command (rootwrap daemon): ['ip', 'netns', 'exec', 'fip-8df87dc6-670f-4f85-861e-3f041537e632', 'ip', '-4', 'route', 'replace', '172.24.5.8/32', 'via', '169.254.95.212', 'dev', 'fpr-d2baa122-5'] {{(pid=21093) execute_rootwrap_daemon /opt/stack/new/neutron/neutron/agent/linux/utils.py:108}}
Aug 17 19:33:40.857528 ubuntu-xenial-2-node-citycloud-kna1-10495241 neutron-l3-agent[21093]: DEBUG neutron.agent.linux.utils [-] Running command (rootwrap daemon): ['ip', 'netns', 'exec', 'fip-8df87dc6-670f-4f85-861e-3f041537e632', 'arping', '-U', '-I', 'fg-1090ccce-f5', '-c', '1', '-w', '1.5', '172.24.5.8'] {{(pid=21093) execute_rootwrap_daemon /opt/stack/new/neutron/neutron/agent/linux/utils.py:108}}
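
Those two rootwrap calls are the usual DVR fip-namespace plumbing: a /32 route for the floating IP via the fpr- veth toward the router namespace, followed by a gratuitous ARP sent out the fg- gateway device. To verify the same state by hand on the affected node, one could run the equivalents inside the namespace; this is a sketch reusing the names from the log, and it requires root:

    import subprocess

    # Names taken from the l3 agent log above.
    FIP_NS = 'fip-8df87dc6-670f-4f85-861e-3f041537e632'
    FIP = '172.24.5.8'

    def in_fip_ns(*cmd):
        # Run a command inside the DVR fip namespace (root required).
        return subprocess.check_output(
            ('ip', 'netns', 'exec', FIP_NS) + cmd).decode()

    # The /32 route installed above should point via 169.254.95.212
    # on fpr-d2baa122-5:
    print(in_fip_ns('ip', '-4', 'route', 'show', FIP + '/32'))

    # Re-send the gratuitous ARP the agent sent, out the fg- device:
    # in_fip_ns('arping', '-U', '-I', 'fg-1090ccce-f5', '-c', '1',
    #           '-w', '1.5', FIP)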

There are no tracebacks in the q-agt or q-l3 logs for the port, or anywhere at all. The failure will probably be hard to debug, but I am reporting it nevertheless, if only for tracking purposes.

The failure is in Queens (current master), but since this is a grenade (upgrade) job, it is probably exercising Pike code.

Changed in neutron:
importance: Undecided → High
tags: added: gate-failure l3-dvr-backlog
Changed in neutron:
status: New → Confirmed

I will keep an eye on this failure.

Miguel Lavalle (minsel) on 2018-03-15
Changed in neutron:
assignee: nobody → Miguel Lavalle (minsel)