component compute test_port_security_disable_security_group tempest failing often

Bug #1869165 reported by Marios Andreou
Affects: tripleo
Status: Fix Released
Importance: Critical
Assigned to: Unassigned

Bug Description

In [1] and [2] for the compute component, the periodic-tripleo-ci-centos-8-standalone-full-tempest-scenario-compute-master job (until yesterday called periodic-tripleo-ci-centos-8-standalone-full-tempest-compute-master, now split into separate -scenario- and -api- jobs) fails with a trace like:

        Traceback (most recent call last):
          File "/usr/lib/python3.6/site-packages/tempest/common/utils/__init__.py", line 108, in wrapper
            return func(*func_args, **func_kwargs)
          File "/usr/lib/python3.6/site-packages/tempest/common/utils/__init__.py", line 89, in wrapper
            return f(*func_args, **func_kwargs)
          File "/usr/lib/python3.6/site-packages/tempest/scenario/test_security_groups_basic_ops.py", line 648, in test_port_security_disable_security_group
            dest=self._get_server_ip(server))
          File "/usr/lib/python3.6/site-packages/tempest/scenario/manager.py", line 1113, in check_remote_connectivity
            self.fail(msg)
          File "/usr/lib/python3.6/site-packages/unittest2/case.py", line 693, in fail
            raise self.failureException(msg)
        AssertionError: Timed out waiting for 10.100.0.12 to become reachable from 192.168.24.103

SSH to the instance succeeds, but pings to different networks are failing.

[1] https://logserver.rdoproject.org/openstack-component-compute/opendev.org/openstack/tripleo-ci/master/periodic-tripleo-ci-centos-8-standalone-full-tempest-scenario-compute-master/45c677d/logs/undercloud/var/log/tempest/stestr_results.html.gz
[2] https://logserver.rdoproject.org/openstack-component-compute/opendev.org/openstack/tripleo-ci/master/periodic-tripleo-ci-centos-8-standalone-full-tempest-compute-master/4eddbe5/logs/undercloud/var/log/tempest/stestr_results.html.gz
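
For context, the check_remote_connectivity() call in the trace above works by SSHing into the source guest (via its floating IP) and pinging the destination address from inside it. A rough, simplified sketch of that flow, assuming a paramiko-style SSH client; the names are illustrative, not tempest's actual code:

    # Rough sketch only; tempest's real implementation is
    # check_remote_connectivity() in tempest/scenario/manager.py.
    import time

    import paramiko


    def remote_ping(ssh_host, ssh_user, key_file, dest_ip, timeout=120):
        """SSH into ssh_host (e.g. a guest floating IP) and ping dest_ip from inside it."""
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(ssh_host, username=ssh_user, key_filename=key_file)
        try:
            deadline = time.time() + timeout
            while time.time() < deadline:
                # One ping probe per second; exit status 0 means the destination answered.
                _, stdout, _ = client.exec_command("ping -c1 -w1 %s" % dest_ip)
                if stdout.channel.recv_exit_status() == 0:
                    return True
                time.sleep(1)
            # This is the point where tempest fails with
            # "Timed out waiting for <dest> to become reachable from <src>".
            return False
        finally:
            client.close()

In this failure the SSH step itself works, so the problem is specifically the in-guest ping across networks.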

Changed in tripleo:
importance: Undecided → Critical
Revision history for this message
chandan kumar (chkumar246) wrote :

The failing test is: tempest.scenario.test_security_groups_basic_ops.TestSecurityGroupsBasicOps.test_port_security_disable_security_group

While looking at
https://logserver.rdoproject.org/openstack-component-compute/opendev.org/openstack/tripleo-ci/master/periodic-tripleo-ci-centos-8-standalone-full-tempest-scenario-compute-master/45c677d/logs/undercloud/var/log/containers/neutron/ovn-metadata-agent.log.txt.gz

X-Ovn-Network-Id: 1704776f-06cd-4ed4-81e1-5c06b5d5556a __call__ /usr/lib/python3.6/site-packages/neutron/agent/ovn/metadata/server.py:65
2020-03-26 08:38:29.957 114596 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.6/site-packages/neutron/agent/ovn/metadata/server.py:131
2020-03-26 08:38:29.958 114596 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 0.9985003
2020-03-26 08:38:30.497 114596 DEBUG eventlet.wsgi.server [-] (114596) accepted '' server /usr/lib/python3.6/site-packages/eventlet/wsgi.py:985
2020-03-26 08:38:30.498 114596 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Accept: */*
Connection: close
Content-Type: text/plain
Host: 169.254.169.254
User-Agent: curl/7.42.1

The connection is getting closed.

I am not sure whether it is related to this OVN issue:
https://logserver.rdoproject.org/openstack-component-compute/opendev.org/openstack/tripleo-ci/master/periodic-tripleo-ci-centos-8-standalone-full-tempest-scenario-compute-master/45c677d/logs/undercloud/var/log/containers/neutron/server.log.txt.gz

2020-03-26 09:06:19.166 28 ERROR futurist.periodics [req-e05a7aba-d875-466c-944d-c5cdb35dea5a - - - - -] Failed to call periodic 'neutron.plugins.ml2.drivers.ovn.mech_driver.ovsdb.maintenance.DBInconsistenciesPeriodics.check_for_ha_chassis_group_address' (it runs every 600.00 seconds): ValueError: max() arg is an empty sequence
2020-03-26 09:06:19.166 28 ERROR futurist.periodics Traceback (most recent call last):
2020-03-26 09:06:19.166 28 ERROR futurist.periodics File "/usr/lib/python3.6/site-packages/futurist/periodics.py", line 293, in run
2020-03-26 09:06:19.166 28 ERROR futurist.periodics work()
2020-03-26 09:06:19.166 28 ERROR futurist.periodics File "/usr/lib/python3.6/site-packages/futurist/periodics.py", line 67, in __call__
2020-03-26 09:06:19.166 28 ERROR futurist.periodics return self.callback(*self.args, **self.kwargs)
2020-03-26 09:06:19.166 28 ERROR futurist.periodics File "/usr/lib/python3.6/site-packages/futurist/periodics.py", line 181, in decorator
2020-03-26 09:06:19.166 28 ERROR futurist.periodics return f(*args, **kwargs)
2020-03-26 09:06:19.166 28 ERROR futurist.periodics File "/usr/lib/python3.6/site-packages/neutron/plugins/ml2/drivers/ovn/mech_driver/ovsdb/maintenance.py", line 576, in check_for_ha_chassis_group_address
2020-03-26 09:06:19.166 28 ERROR futurist.periodics high_prio_ch = max(default_ch_grp.ha_chassis, key=lambda x: x.priority)
2020-03-26 09:06:19.166 28 ERROR futurist.periodics ValueError: max() arg is an empty sequence
2020-03-26 09:06:19.166 28 ERROR futurist.periodics

Need to take ...
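
For reference, the ValueError above is simply max() being called on an empty sequence: default_ch_grp.ha_chassis appears to contain no entries when the periodic task runs. A minimal illustration of the failure mode and an obvious guard (the names below are stand-ins, not the actual OVN maintenance code):

    # Minimal illustration, not the real neutron/OVN code: max() over an empty
    # list raises ValueError, which is what the periodic task hits.
    import collections

    HaChassis = collections.namedtuple('HaChassis', ['chassis_name', 'priority'])

    ha_chassis = []  # stands in for default_ch_grp.ha_chassis with no entries

    try:
        high_prio_ch = max(ha_chassis, key=lambda x: x.priority)
    except ValueError:
        # A guarded version of the periodic task would skip the group instead.
        high_prio_ch = None

    print(high_prio_ch)  # None -> nothing to check for this chassis group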


Revision history for this message
chandan kumar (chkumar246) wrote :

Apart from that, in https://logserver.rdoproject.org/openstack-component-compute/opendev.org/openstack/tripleo-ci/master/periodic-tripleo-ci-centos-8-standalone-full-tempest-scenario-compute-master/45c677d/logs/undercloud/var/log/containers/nova/nova-novncproxy.log.txt.gz we are seeing:

2020-03-26 08:13:30.225 7 CRITICAL nova [-] Unhandled error: TypeError: __init__() got an unexpected keyword argument 'ssl_ciphers'
2020-03-26 08:13:30.225 7 ERROR nova Traceback (most recent call last):
2020-03-26 08:13:30.225 7 ERROR nova File "/usr/bin/nova-novncproxy", line 10, in <module>
2020-03-26 08:13:30.225 7 ERROR nova sys.exit(main())
2020-03-26 08:13:30.225 7 ERROR nova File "/usr/lib/python3.6/site-packages/nova/cmd/novncproxy.py", line 49, in main
2020-03-26 08:13:30.225 7 ERROR nova security_proxy=security_proxy)
2020-03-26 08:13:30.225 7 ERROR nova File "/usr/lib/python3.6/site-packages/nova/cmd/baseproxy.py", line 83, in proxy
2020-03-26 08:13:30.225 7 ERROR nova security_proxy=security_proxy,
2020-03-26 08:13:30.225 7 ERROR nova File "/usr/lib/python3.6/site-packages/nova/console/websocketproxy.py", line 313, in __init__
2020-03-26 08:13:30.225 7 ERROR nova super(NovaWebSocketProxy, self).__init__(*args, **kwargs)
2020-03-26 08:13:30.225 7 ERROR nova File "/usr/lib/python3.6/site-packages/websockify/websocketproxy.py", line 265, in __init__
2020-03-26 08:13:30.225 7 ERROR nova websocket.WebSocketServer.__init__(self, RequestHandlerClass, *args, **kwargs)
2020-03-26 08:13:30.225 7 ERROR nova TypeError: __init__() got an unexpected keyword argument 'ssl_ciphers'

Not sure whether the two issues above are interrelated and causing the SSH timeout.
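
For reference, the TypeError above is a keyword-argument mismatch: a newer nova passes ssl_ciphers down through the websocket proxy's **kwargs, while the installed (older) websockify WebSocketServer.__init__ does not accept that argument. A toy reproduction under that assumption (stand-in classes, not the real nova/websockify code):

    # Toy reproduction of the kwarg mismatch; these classes only mimic the
    # call chain NovaWebSocketProxy -> WebSocketProxy -> WebSocketServer.
    class WebSocketServerLike:                  # older websockify: no ssl_ciphers
        def __init__(self, request_handler, listen_host='', listen_port=0):
            self.request_handler = request_handler


    class WebSocketProxyLike(WebSocketServerLike):
        def __init__(self, *args, **kwargs):    # forwards everything unchanged
            WebSocketServerLike.__init__(self, *args, **kwargs)


    class NovaWebSocketProxyLike(WebSocketProxyLike):
        def __init__(self, *args, **kwargs):    # newer nova adds ssl_ciphers here
            super().__init__(*args, **kwargs)


    try:
        NovaWebSocketProxyLike(object, listen_host='0.0.0.0', listen_port=6080,
                               ssl_ciphers='DEFAULT')
    except TypeError as exc:
        print(exc)  # ... got an unexpected keyword argument 'ssl_ciphers'

Updating websockify to a version whose server accepts ssl_ciphers removes the mismatch, which matches the fix noted in the next comments.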

Revision history for this message
chandan kumar (chkumar246) wrote :

https://logserver.rdoproject.org/openstack-component-compute/opendev.org/openstack/tripleo-ci/master/periodic-tripleo-ci-centos-8-standalone-full-tempest-scenario-compute-master/45c677d/logs/undercloud/var/log/extra/errors.txt.txt.gz

Not sure whether these errors are also linked:
2020-03-26 08:15:15.914 ERROR /var/log/containers/neutron/server.log.1: 28 ERROR ovsdbapp.backend.ovs_idl.idlutils [-] Unable to open stream to tcp:192.168.24.1:6641 to retrieve schema: Connection refused
2020-03-26 08:15:15.959 ERROR /var/log/containers/neutron/server.log.1: 27 ERROR ovsdbapp.backend.ovs_idl.idlutils [req-eed64f1b-e710-4597-af40-ce34ce5d0a47 - - - - -] Unable to open stream to tcp:192.168.24.1:6641 to retrieve schema: Connection refused
2020-03-26 08:15:15.959 ERROR /var/log/containers/neutron/server.log.1: 26 ERROR ovsdbapp.backend.ovs_idl.idlutils [req-96b04752-2e80-4af3-a3dc-98452e778ba7 - - - - -] Unable to open stream to tcp:192.168.24.1:6641 to retrieve schema: Connection refused
2020-03-26 08:15:15.960 ERROR /var/log/containers/neutron/server.log.1: 25 ERROR ovsdbapp.backend.ovs_idl.idlutils [req-db2f8845-8c7a-4ca4-ab8f-b130e0c61710 - - - - -] Unable to open stream to tcp:192.168.24.1:6641 to retrieve schema: Connection refused
2020-03-26 08:15:15.966 ERROR /var/log/containers/neutron/server.log.1: 29 ERROR ovsdbapp.backend.ovs_idl.idlutils [req-5ef99c6d-aac6-4247-abbc-68109a4a89f8 - - - - -] Unable to open stream to tcp:192.168.24.1:6641 to retrieve schema: Connection refused
2020-03-26 08:15:16.917 ERROR /var/log/containers/neutron/server.log.1: 28 ERROR ovsdbapp.backend.ovs_idl.idlutils [-] Unable to open stream to tcp:192.168.24.1:6641 to retrieve schema: Connection refused
2020-03-26 08:15:16.962 ERROR /var/log/containers/neutron/server.log.1: 27 ERROR ovsdbapp.backend.ovs_idl.idlutils [req-eed64f1b-e710-4597-af40-ce34ce5d0a47 - - - - -] Unable to open stream to tcp:192.168.24.1:6641 to retrieve schema: Connection refused
2020-03-26 08:15:16.962 ERROR /var/log/containers/neutron/server.log.1: 26 ERROR ovsdbapp.backend.ovs_idl.idlutils [req-96b04752-2e80-4af3-a3dc-98452e778ba7 - - - - -] Unable to open stream to tcp:192.168.24.1:6641 to retrieve schema: Connection refused
2020-03-26 08:15:16.963 ERROR /var/log/containers/neutron/server.log.1: 25 ERROR ovsdbapp.backend.ovs_idl.idlutils [req-db2f8845-8c7a-4ca4-ab8f-b130e0c61710 - - - - -] Unable to open stream to tcp:192.168.24.1:6641 to retrieve schema: Connection refused
2020-03-26 08:15:16.969 ERROR /var/log/containers/neutron/server.log.1: 29 ERROR ovsdbapp.backend.ovs_idl.idlutils [req-5ef99c6d-aac6-4247-abbc-68109a4a89f8 - - - - -] Unable to open stream to tcp:192.168.24.1:6641 to retrieve schema: Connection refused
2020-03-26 08:15:18.922 ERROR /var/log/containers/neutron/server.log.1: 28 ERROR ovsdbapp.backend.ovs_idl.idlutils [-] Unable to open stream to tcp:192.168.24.1:6641 to retrieve schema: Connection refused
2020-03-26 08:15:18.965 ERROR /var/log/containers/neutron/server.log.1: 26 ERROR ovsdbapp.backend.ovs_idl.idlutils [req-96b04752-2e80-4af3-a3dc-98452e778ba7 - - - - -] Unable to open stream to tcp:192.168.24.1:6641 to retrieve schema: Connection refused
2020-03-26 08:15:18.965 ERROR /var/log/co...


description: updated
Revision history for this message
chandan kumar (chkumar246) wrote :

Edit:
* The OVN metadata logs are OK: the server booted up and SSH is working fine.
* Ping to a different network is not working (not sure why this is happening).
* The nova-novncproxy error is being fixed by https://review.rdoproject.org/r/#/c/26113/, which updates websockify.

wes hayutin (weshayutin)
Changed in tripleo:
milestone: ussuri-3 → ussuri-rc3
wes hayutin (weshayutin)
Changed in tripleo:
status: Triaged → Fix Released