Activity log for bug #1735823

Date Who What changed Old value New value Message
2017-12-01 19:53:53 David Moreau Simard bug added bug
2017-12-01 19:54:12 David Moreau Simard summary "Tempest makes Nova hang when creating a VM with file injection" → "Nova can hang when creating a VM with file injection"
2017-12-01 19:56:54 David Moreau Simard description updated; the new value adds the sentence "This has been happening only on the OVH cloud regions as far as we know." The new description reads:

    We are noticing recurring failures in the gate under CentOS and OpenSUSE across Devstack and Packstack jobs on the master branches. This has been happening only on the OVH cloud regions as far as we know.

    Example failures:
    - CentOS Devstack: http://logs.openstack.org/46/523646/1/check/legacy-tempest-dsvm-neutron-full-centos-7/5bf092c/job-output.txt#_2017-11-29_03_02_38_031560
    - OpenSUSE Devstack: http://logs.openstack.org/23/522423/7/check/legacy-tempest-dsvm-neutron-full-opensuse-423/b6768d7/job-output.txt#_2017-11-27_21_53_13_340319
    - CentOS Packstack: http://logs.openstack.org/14/516714/1/check/packstack-integration-scenario002-tempest/7ba8d06/job-output.txt.gz#_2017-10-31_17_55_39_845816

    They all fail with the same stack trace:

    =====
    setUpClass (tempest.api.compute.servers.test_create_server.ServersTestJSON)
    ---------------------------------------------------------------------------

    Captured traceback:
    ~~~~~~~~~~~~~~~~~~~
        Traceback (most recent call last):
          File "tempest/test.py", line 172, in setUpClass
            six.reraise(etype, value, trace)
          File "tempest/test.py", line 165, in setUpClass
            cls.resource_setup()
          File "tempest/api/compute/servers/test_create_server.py", line 64, in resource_setup
            volume_backed=cls.volume_backed)
          File "tempest/api/compute/base.py", line 190, in create_test_server
            **kwargs)
          File "tempest/common/compute.py", line 258, in create_test_server
            server['id'])
          File "/opt/stack/new/tempest/.tox/tempest/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
            self.force_reraise()
          File "/opt/stack/new/tempest/.tox/tempest/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
            six.reraise(self.type_, self.value, self.tb)
          File "tempest/common/compute.py", line 229, in create_test_server
            clients.servers_client, server['id'], wait_until)
          File "tempest/common/waiters.py", line 96, in wait_for_server_status
            raise lib_exc.TimeoutException(message)
        tempest.lib.exceptions.TimeoutException: Request timed out
        Details: (ServersTestJSON:setUpClass) Server 2f8de011-b218-4e73-b9e3-e7fcf9e9278b failed to reach ACTIVE status and task state "None" within the required time (196 s). Current status: BUILD. Current task state: spawning.
    =====
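The traceback bottoms out in tempest's status waiter: the test polls the server until it reaches ACTIVE, and raises TimeoutException when the guest stays stuck in BUILD/spawning (here, blocked on disk injection). A minimal sketch of that polling pattern, assuming illustrative names rather than tempest's actual implementation:

```python
import time


class TimeoutException(Exception):
    """Raised when the server does not reach the expected status in time."""


def wait_for_server_status(get_status, expected_status, timeout=196, interval=1):
    """Poll get_status() until it returns expected_status or the timeout
    (in seconds) elapses. If the hypervisor hangs while spawning the guest
    (e.g. blocked on disk injection), the status never changes and this
    loop raises, which is the failure seen in the gate logs above."""
    start = time.time()
    status = get_status()
    while status != expected_status:
        if time.time() - start >= timeout:
            raise TimeoutException(
                "Server failed to reach %s status within the required time "
                "(%d s). Current status: %s." % (expected_status, timeout, status))
        time.sleep(interval)
        status = get_status()
    return status
```

The 196 s default mirrors the timeout reported in the failure details; in tempest it comes from configuration, not a hard-coded value.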
2017-12-01 20:09:50 Matt Riedemann summary "Nova can hang when creating a VM with file injection" → "Nova can hang when creating a VM with disk injection"
2017-12-01 20:14:28 Matt Riedemann tags guestfs injection
2017-12-01 22:10:58 Matt Riedemann nova: status New → Confirmed
2017-12-01 22:43:54 OpenStack Infra nova: status Confirmed → In Progress
2017-12-01 22:43:54 OpenStack Infra nova: assignee Matt Riedemann (mriedem)
2017-12-01 22:48:24 Matt Riedemann nova: importance Undecided → Medium
2017-12-01 22:48:27 Matt Riedemann nominated for series nova/ocata
2017-12-01 22:48:27 Matt Riedemann bug task added nova/ocata
2017-12-01 22:48:27 Matt Riedemann nominated for series nova/pike
2017-12-01 22:48:27 Matt Riedemann bug task added nova/pike
2017-12-05 13:18:50 Kashyap Chamarthy bug added subscriber Kashyap Chamarthy