test_cross_tenant_traffic tempest test failed with "Timed out waiting for 10.0.0.112 to become reachable from 10.0.0.118" in fs020 master

Bug #1860673 reported by chandan kumar on 2020-01-23
Affects: tripleo
Importance: Critical
Assigned to: Unassigned

Bug Description

FS020 master periodic job failed at tempest.scenario.test_security_groups_basic_ops.TestSecurityGroupsBasicOps.test_cross_tenant_traffic[compute,id-e79f879e-debb-440c-a7e4-efeda05b6848,network] with Timed out waiting for 10.0.0.112 to become reachable from 10.0.0.118

From logs:
http://logs.rdoproject.org/openstack-periodic-master/opendev.org/openstack/tripleo-ci/master/periodic-tripleo-ci-centos-7-ovb-1ctlr_2comp-featureset020-master/9a5146d/logs/undercloud/var/log/tempest/tempest_run.log

tempest.scenario.test_security_groups_basic_ops.TestSecurityGroupsBasicOps.test_cross_tenant_traffic[compute,id-e79f879e-debb-440c-a7e4-efeda05b6848,network]
-------------------------------------------------------------------------------------------------------------------------------------------------------------

Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "/usr/lib/python2.7/site-packages/tempest/common/utils/__init__.py", line 89, in wrapper
        return f(*func_args, **func_kwargs)
      File "/usr/lib/python2.7/site-packages/tempest/scenario/test_security_groups_basic_ops.py", line 508, in test_cross_tenant_traffic
        self._test_cross_tenant_allow(source_tenant, dest_tenant, ruleset)
      File "/usr/lib/python2.7/site-packages/tempest/scenario/test_security_groups_basic_ops.py", line 424, in _test_cross_tenant_allow
        self.check_remote_connectivity(access_point_ssh, ip, protocol=protocol)
      File "/usr/lib/python2.7/site-packages/tempest/scenario/manager.py", line 1078, in check_remote_connectivity
        self.fail(msg)
      File "/usr/lib/python2.7/site-packages/unittest2/case.py", line 690, in fail
        raise self.failureException(msg)
    AssertionError: Timed out waiting for 10.0.0.112 to become reachable from 10.0.0.118
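For context, a minimal sketch of the check that produces this message: tempest's check_remote_connectivity repeatedly pings the destination from the access-point VM over SSH and fails with "Timed out waiting for ... to become reachable from ..." once the retry window expires. The helper names below (ssh_exec, wait_until_reachable) are illustrative only, not tempest's actual API:

    # Illustrative sketch only -- not tempest's code. It mimics the retry loop
    # behind "Timed out waiting for X to become reachable from Y": ping the
    # destination from the access-point VM over SSH until a deadline passes.
    import subprocess
    import time

    def ssh_exec(host, command):
        # Minimal SSH runner for illustration; assumes key-based auth is set up.
        return subprocess.call(["ssh", host, command])

    def wait_until_reachable(access_point_ip, dest_ip, timeout=120, interval=5):
        # Same probe the tempest log shows: a single 56-byte ping with a 1s wait.
        cmd = "ping -c1 -w1 -s56 %s" % dest_ip
        deadline = time.time() + timeout
        while time.time() < deadline:
            if ssh_exec(access_point_ip, cmd) == 0:
                return
            time.sleep(interval)
        raise AssertionError("Timed out waiting for %s to become reachable "
                             "from %s" % (dest_ip, access_point_ip))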

While Looking at http://logs.rdoproject.org/openstack-periodic-master/opendev.org/openstack/tripleo-ci/master/periodic-tripleo-ci-centos-7-ovb-1ctlr_2comp-featureset020-master/9a5146d/logs/overcloud-controller-0/var/log/extra/errors.txt.txt

2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers [req-d2fb5a44-2941-480a-bf2d-5a97a183d66b - - - - -] Mechanism driver 'ovn' failed in update_port_postcommit: StandardAttributeIDNotFound: Standard attribute ID not found for 3d2cb17c-e10d-4551-a2b9-963dfc8f03c3
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers Traceback (most recent call last):
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/managers.py", line 475, in _call_on_drivers
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers getattr(driver.obj, method_name)(context)
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers File "/usr/lib/python2.7/site-packages/networking_ovn/ml2/mech_driver.py", line 599, in update_port_postcommit
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers self._ovn_client.update_port(port, port_object=original_port)
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers File "/usr/lib/python2.7/site-packages/networking_ovn/common/ovn_client.py", line 697, in update_port
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers db_rev.bump_revision(port, ovn_const.TYPE_PORTS)
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers File "/usr/lib/python2.7/site-packages/oslo_db/api.py", line 154, in wrapper
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers ectxt.value = e.inner_exc
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers self.force_reraise()
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers six.reraise(self.type_, self.value, self.tb)
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers File "/usr/lib/python2.7/site-packages/oslo_db/api.py", line 142, in wrapper
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers return f(*args, **kwargs)
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers File "/usr/lib/python2.7/site-packages/networking_ovn/db/revision.py", line 126, in bump_revision
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers session, resource['id'], resource_type)
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers File "/usr/lib/python2.7/site-packages/networking_ovn/db/revision.py", line 51, in _get_standard_attr_id
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers resource_uuid=resource_uuid)
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers StandardAttributeIDNotFound: Standard attribute ID not found for 3d2cb17c-e10d-4551-a2b9-963dfc8f03c3
2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event [req-d2fb5a44-2941-480a-bf2d-5a97a183d66b - - - - -] Unexpected exception in notify_loop: MechanismDriverError
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event Traceback (most recent call last):
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/ovsdbapp/event.py", line 143, in notify_loop
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event match.run(event, row, updates)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/networking_ovn/ovsdb/ovsdb_monitor.py", line 159, in run
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event router, host)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/networking_ovn/l3/l3_ovn.py", line 308, in update_router_gateway_port_bindings
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event {'port': {portbindings.HOST_ID: host}})
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/neutron/common/utils.py", line 685, in inner
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event return f(self, context, *args, **kwargs)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/neutron_lib/db/api.py", line 233, in wrapped
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event return method(*args, **kwargs)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/neutron_lib/db/api.py", line 139, in wrapped
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event setattr(e, '_RETRY_EXCEEDED', True)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event self.force_reraise()
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event six.reraise(self.type_, self.value, self.tb)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/neutron_lib/db/api.py", line 135, in wrapped
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event return f(*args, **kwargs)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_db/api.py", line 154, in wrapper
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event ectxt.value = e.inner_exc
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event self.force_reraise()
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event six.reraise(self.type_, self.value, self.tb)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_db/api.py", line 142, in wrapper
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event return f(*args, **kwargs)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/neutron_lib/db/api.py", line 183, in wrapped
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event LOG.debug("Retry wrapper got retriable exception: %s", e)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event self.force_reraise()
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event six.reraise(self.type_, self.value, self.tb)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/neutron_lib/db/api.py", line 179, in wrapped
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event return f(*dup_args, **dup_kwargs)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 1792, in update_port
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event need_notify=need_port_update_notify)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/neutron_lib/db/api.py", line 139, in wrapped
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event setattr(e, '_RETRY_EXCEEDED', True)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event self.force_reraise()
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event six.reraise(self.type_, self.value, self.tb)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/neutron_lib/db/api.py", line 135, in wrapped
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event return f(*args, **kwargs)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_db/api.py", line 154, in wrapper
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event ectxt.value = e.inner_exc
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event self.force_reraise()
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event six.reraise(self.type_, self.value, self.tb)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_db/api.py", line 142, in wrapper
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event return f(*args, **kwargs)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/neutron_lib/db/api.py", line 183, in wrapped
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event LOG.debug("Retry wrapper got retriable exception: %s", e)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event self.force_reraise()
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event six.reraise(self.type_, self.value, self.tb)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/neutron_lib/db/api.py", line 179, in wrapped
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event return f(*dup_args, **dup_kwargs)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 520, in _bind_port_if_needed
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event need_notify))
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 728, in _commit_port_binding
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event self.mechanism_manager.update_port_postcommit(cur_context)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/managers.py", line 743, in update_port_postcommit
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event continue_on_failure=True)
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/managers.py", line 493, in _call_on_drivers
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event errors=errors
2020-01-23 05:55:06.073 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR ovsdbapp.event MechanismDriverError
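For reference, the StandardAttributeIDNotFound above comes from networking_ovn's revision tracking: before bumping a resource's revision number it looks up the resource's standard-attribute ID, and it raises if no row is found for that UUID (which can happen if the port has already gone away by the time update_port_postcommit runs). Below is a simplified, self-contained sketch of that lookup, with a plain dict standing in for the Neutron database; it is not the real networking_ovn code.

    # Simplified stand-in for networking_ovn/db/revision.py: the real code
    # queries the standardattributes table via SQLAlchemy; a dict is used here
    # so the failure mode is easy to see.

    class StandardAttributeIDNotFound(Exception):
        def __init__(self, resource_uuid):
            super(StandardAttributeIDNotFound, self).__init__(
                "Standard attribute ID not found for %s" % resource_uuid)

    # Pretend database: resource UUID -> standard_attr_id.
    STANDARD_ATTRS = {}

    # In-memory revision counters keyed by standard_attr_id.
    REVISIONS = {}

    def _get_standard_attr_id(resource_uuid):
        try:
            return STANDARD_ATTRS[resource_uuid]
        except KeyError:
            # The situation in the log: no row exists for this UUID any more,
            # so the lookup fails and ml2 turns it into a MechanismDriverError.
            raise StandardAttributeIDNotFound(resource_uuid)

    def bump_revision(resource_uuid):
        attr_id = _get_standard_attr_id(resource_uuid)
        REVISIONS[attr_id] = REVISIONS.get(attr_id, 0) + 1
        return REVISIONS[attr_id]

    # bump_revision("3d2cb17c-e10d-4551-a2b9-963dfc8f03c3") would raise
    # StandardAttributeIDNotFound here, matching the traceback above.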

We are looking into what caused the timeout.

chandan kumar (chkumar246) wrote:

Re-running the job here: https://review.rdoproject.org/r/24647, as the failure appears to be transient.

tags: removed: promotion-blocker

Hi,

I took a look at this error, but it seems like a random failure.

The traceback in the bug description doesn't match the timestamp of the test failure. The last SSH attempt happened at:

2020-01-23 05:39:27,667 330915 ERROR [tempest.lib.common.utils.linux.remote_client] (TestSecurityGroupsBasicOps:test_cross_tenant_traffic) Executing command on 10.0.0.118 failed. Error: Command 'set -eu -o pipefail; PATH=$PATH:/sbin:/usr/sbin; ping -c1 -w1 -s56 10.0.0.112' failed, exit status: 1, stderr:

while the error traceback was logged at:

2020-01-23 05:55:06.062 ERROR /var/log/containers/neutron/server.log.1: 34 ERROR neutron.plugins.ml2.managers StandardAttributeIDNotFound: Standard attribute ID not found for 3d2cb17c-e10d-4551-a2b9-963dfc8f03c3

That is roughly 15 minutes later, so I don't think the traceback is related.
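
A quick sanity check of that gap, using the two timestamps quoted above:

    # Delta between the last tempest ping attempt and the neutron traceback.
    from datetime import datetime

    last_ping = datetime.strptime("2020-01-23 05:39:27,667",
                                  "%Y-%m-%d %H:%M:%S,%f")
    neutron_err = datetime.strptime("2020-01-23 05:55:06.062",
                                    "%Y-%m-%d %H:%M:%S.%f")
    print(neutron_err - last_ping)  # 0:15:38.395000 -> roughly 15 minutes later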

I've also deployed devstack in a VM and run this test a couple of times, and it has been passing for me:

{0} tempest.scenario.test_security_groups_basic_ops.TestSecurityGroupsBasicOps.test_cross_tenant_traffic [134.334584s] ... ok

...

Can we retry this job and see? If it fails again, I will do a deeper analysis of what may be going on there.

chandan kumar (chkumar246) wrote:

Another hit by the same issue: http://logs.rdoproject.org/openstack-periodic-master/opendev.org/openstack/tripleo-ci/master/periodic-tripleo-ci-centos-7-ovb-1ctlr_2comp-featureset020-master/851c265/logs/undercloud/var/log/tempest/tempest_run.log

tempest.scenario.test_security_groups_basic_ops.TestSecurityGroupsBasicOps.test_cross_tenant_traffic[compute,id-e79f879e-debb-440c-a7e4-efeda05b6848,network]
-------------------------------------------------------------------------------------------------------------------------------------------------------------

Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "/usr/lib/python2.7/site-packages/tempest/common/utils/__init__.py", line 89, in wrapper
        return f(*func_args, **func_kwargs)
      File "/usr/lib/python2.7/site-packages/tempest/scenario/test_security_groups_basic_ops.py", line 508, in test_cross_tenant_traffic
        self._test_cross_tenant_allow(source_tenant, dest_tenant, ruleset)
      File "/usr/lib/python2.7/site-packages/tempest/scenario/test_security_groups_basic_ops.py", line 424, in _test_cross_tenant_allow
        self.check_remote_connectivity(access_point_ssh, ip, protocol=protocol)
      File "/usr/lib/python2.7/site-packages/tempest/scenario/manager.py", line 1078, in check_remote_connectivity
        self.fail(msg)
      File "/usr/lib/python2.7/site-packages/unittest2/case.py", line 690, in fail
        raise self.failureException(msg)
    AssertionError: Timed out waiting for 10.0.0.116 to become reachable from 10.0.0.102

Re-running again in CI.

wes hayutin (weshayutin) on 2020-02-10
Changed in tripleo:
milestone: ussuri-2 → ussuri-3
wes hayutin (weshayutin) wrote:
tags: removed: alert