The tripleo-ci-fedora-28-standalone job is failing consistently on the following tests:
* tempest.scenario.test_network_basic_ops.TestNetworkBasicOps [1.]
* tempest.scenario.test_volume_boot_pattern.TestVolumeBootPattern [2.]
Both fail for the same reason:
[1.] http://logs.openstack.org/97/661697/1/check/tripleo-ci-fedora-28-standalone/e55278e/logs/tempest.html.gz
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/tempest/common/utils/__init__.py", line 89, in wrapper
return f(*func_args, **func_kwargs)
File "/usr/lib/python3.6/site-packages/tempest/scenario/test_network_basic_ops.py", line 708, in test_preserve_preexisting_port
self._setup_network_and_servers(boot_with_port=True)
File "/usr/lib/python3.6/site-packages/tempest/scenario/test_network_basic_ops.py", line 119, in _setup_network_and_servers
server = self._create_server(self.network, port_id)
File "/usr/lib/python3.6/site-packages/tempest/scenario/test_network_basic_ops.py", line 171, in _create_server
security_groups=security_groups)
File "/usr/lib/python3.6/site-packages/tempest/scenario/manager.py", line 235, in create_server
image_id=image_id, **kwargs)
File "/usr/lib/python3.6/site-packages/tempest/common/compute.py", line 265, in create_test_server
server['id'])
File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
self.force_reraise()
File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
six.reraise(self.type_, self.value, self.tb)
File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
raise value
File "/usr/lib/python3.6/site-packages/tempest/common/compute.py", line 236, in create_test_server
clients.servers_client, server['id'], wait_until)
File "/usr/lib/python3.6/site-packages/tempest/common/waiters.py", line 76, in wait_for_server_status
server_id=server_id)
tempest.exceptions.BuildErrorException: Server 728e4226-3ffa-4a34-b570-8f84601a72fb failed to build and is in ERROR status
Details: {'code': 500, 'created': '2019-05-28T10:03:57Z', 'message': 'Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 728e4226-3ffa-4a34-b570-8f84601a72fb.'}
[2.] http://logs.openstack.org/99/661599/1/gate/tripleo-ci-fedora-28-standalone/6a1da25/logs/tempest.html.gz
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/tempest/common/utils/__init__.py", line 89, in wrapper
return f(*func_args, **func_kwargs)
File "/usr/lib/python3.6/site-packages/tempest/scenario/test_volume_boot_pattern.py", line 131, in test_volume_boot_pattern
security_group=security_group)
File "/usr/lib/python3.6/site-packages/tempest/scenario/test_volume_boot_pattern.py", line 68, in _boot_instance_from_resource
return self.create_server(image_id='', **create_kwargs)
File "/usr/lib/python3.6/site-packages/tempest/scenario/manager.py", line 235, in create_server
image_id=image_id, **kwargs)
File "/usr/lib/python3.6/site-packages/tempest/common/compute.py", line 265, in create_test_server
server['id'])
File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
self.force_reraise()
File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
six.reraise(self.type_, self.value, self.tb)
File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
raise value
File "/usr/lib/python3.6/site-packages/tempest/common/compute.py", line 236, in create_test_server
clients.servers_client, server['id'], wait_until)
File "/usr/lib/python3.6/site-packages/tempest/common/waiters.py", line 76, in wait_for_server_status
server_id=server_id)
tempest.exceptions.BuildErrorException: Server 6312266c-263b-4ca5-8484-85f019af7db2 failed to build and is in ERROR status
Details: {'code': 500, 'created': '2019-05-28T08:34:25Z', 'message': 'Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 6312266c-263b-4ca5-8484-85f019af7db2.'}
Further investigation is needed to find the failure reason.
http://logs.openstack.org/99/661599/1/gate/tripleo-ci-fedora-28-standalone/6a1da25/logs/undercloud/var/log/containers/nova/nova-compute.log.txt.gz#_2019-05-28_08_34_15_142
This seems to be related to concurrency on the network side; the networking team should take a look.
2019-05-28 08:34:15.142 8 ERROR vif_plug_ovs.ovsdb.impl_vsctl [req-e7b4f0d3-33d1-4dee-8646-de4ac02b6149 f316983829a74e1eaf29f164c4e5cecc 246b9cbfe0a94514962f71bab62a30f4 - default default] Unable to execute ['ovs-vsctl', '--timeout=120', '--oneline', '--format=json', '--db=tcp:127.0.0.1:6640', '--', '--may-exist', 'add-br', 'br-int', '--', 'set', 'Bridge', 'br-int', 'datapath_type=system']. Exception: Unexpected error while running command.
Command: ovs-vsctl --timeout=120 --oneline --format=json --db=tcp:127.0.0.1:6640 -- --may-exist add-br br-int -- set Bridge br-int datapath_type=system
Exit code: 1
Stdout: ''
Stderr: 'ovs-vsctl: tcp:127.0.0.1:6640: database connection failed (Connection refused)\n': oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
2019-05-28 08:34:15.143 8 INFO os_vif [req-e7b4f0d3-33d1-4dee-8646-de4ac02b6149 f316983829a74e1eaf29f164c4e5cecc 246b9cbfe0a94514962f71bab62a30f4 - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:35:f8,bridge_name='br-int',has_traffic_filtering=True,id=3926672a-c06d-4455-a7c2-a718c0d15907,network=Network(1dfcc0ad-a672-43d5-800b-52d4413307a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3926672a-c0')
2019-05-28 08:34:15.287 8 DEBUG nova.virt.libvirt.driver [req-e7b4f0d3-33d1-4dee-8646-de4ac02b6149 f316983829a74e1eaf29f164c4e5cecc 246b9cbfe0a94514962f71bab62a30f4 - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py:9366
2019-05-28 08:34:15.288 8 DEBUG nova.virt.libvirt.driver [req-e7b4f0d3-33d1-4dee-8646-de4ac02b6149 f316983829a74e1eaf29f164c4e5cecc 246b9cbfe0a94514962f71bab62a30f4 - default default] No VIF found with MAC fa:16:3e:0b:35:f8, not building metadata _build_interface_metadata /usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py:9342
2019-05-28 08:34:15.377 8 DEBUG nova.network.neutronv2.api [req-7051defb-3e74-4dd8-adb3-c03c211f6724 307ac996d216442ab5a5659f4b0c0aec 3a5aed9cb1b64566bf5646998fbba091 - default default] [instance: 6312266c-263b-4ca5-8484-85f019af7db2] Updated VIF entry in instance network info cache for port 3926672a-c06d-4455-a7c2-a718c0d15907. _build_network_info_model /usr/lib/python3.6/site-packages/nova/network/neutronv2/api.py:2971
2019-05-28 08:34:15.378 8 DEBUG nova.network.base_api [req-7051defb-3e74-4dd8-adb3-c03c211f6724 307ac996d216442ab5a5659f4b0c0aec 3a5aed9cb1b64566bf5646998fbba091 - default default] [instance: 6312266c-263b-4ca5-8484-85f019af7db2] Updating instance_info_cache with network_info: [{"id": "3926672a-c06d-4455-a7c2-a718c0d15907", "address": "fa:16:3e:0b:35:f8", "network": {"id": "1dfcc0ad-a672-43d5-800b-52d4413307a8", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-1415094620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": ...
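The Stderr line shows ovs-vsctl getting "connection refused" on tcp:127.0.0.1:6640, i.e. ovsdb-server was not accepting connections when nova-compute tried to plug the vif. A quick way to triage this on the compute host is to probe that endpoint directly; the sketch below is my own (the helper name and probing approach are assumptions, not part of the job), it just checks whether anything is listening there:

```python
import socket

# Address ovs-vsctl was told to use (from the log above); adjust if the
# deployment configures a different ovsdb-server listener.
OVSDB_HOST = "127.0.0.1"
OVSDB_PORT = 6640


def ovsdb_reachable(host=OVSDB_HOST, port=OVSDB_PORT, timeout=3.0):
    """Return True if a TCP connection to the ovsdb-server endpoint succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers ConnectionRefusedError (nothing listening) and timeouts.
        return False


if __name__ == "__main__":
    print("ovsdb reachable:", ovsdb_reachable())
```

Running this in a loop while the tempest tests execute would show whether ovsdb-server is down the whole time or only flaps during the vif-plug window, which would narrow the concurrency suspicion above.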