[stable/train] Creation of the QoS policy takes ages

Bug #1898748 reported by Slawek Kaplonski
Affects: neutron
Status: Expired
Importance: Critical
Assigned to: Unassigned

Bug Description

It seems that on stable/train the openstacksdk-functional-devstack job is failing because one of the QoS tests gets stuck "inprogress":

openstack.tests.functional.cloud.test_qos_policy.TestQosPolicy.test_create_qos_policy_default
or
openstack.tests.functional.cloud.test_qos_policy.TestQosPolicy.test_create_qos_policy_basic

Failure example: https://storage.bhs.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_82a/755958/1/check/openstacksdk-functional-devstack/82a83f0/job-output.txt

tags: added: api
description: updated
Revision history for this message
Rodolfo Alonso (rodolfo-alonso-hernandez) wrote :

Hello:

This is not only Neutron: all OpenStack services and agents stop working. In the logs provided [1], reviewing the rabbitmq logs [2], we can see that all services stop connecting to the MQ. There is a hiatus from 9:06 to 9:13.

I still don't know why this is happening; I don't see anything in the logs. Actually, the logs simply jump from 9:06 to 9:13 with no explanation.

The system is still working [3] and there are no resource consumption spikes (at least I don't see any).

Regards.

[1] https://storage.bhs.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_82a/755958/1/check/openstacksdk-functional-devstack/82a83f0/job-output.txt
[2] https://storage.bhs.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_82a/755958/1/check/openstacksdk-functional-devstack/82a83f0/controller/logs/rabbitmq/rabbit%40ubuntu-bionic-vexxhost-ca-ymq-1-0020274479_log.txt
[3] https://storage.bhs.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_82a/755958/1/check/openstacksdk-functional-devstack/82a83f0/controller/logs/dstat-csv_log.txt
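A quick way to spot silent periods like the 9:06-9:13 hiatus is to scan a log for large gaps between consecutive timestamps. This is a hypothetical helper, not part of the bug report: it assumes the log's first field is an HH:MM:SS timestamp (adjust the field split for the actual rabbitmq log format).

```shell
# Hypothetical helper: print any gap longer than 60 s between consecutive
# log lines whose first field is an HH:MM:SS timestamp.
find_gaps() {
    awk '{
        split($1, t, ":")
        s = t[1]*3600 + t[2]*60 + t[3]
        if (prev != "" && s - prev > 60)
            printf "gap of %d s before: %s\n", s - prev, $0
        prev = s
    }' "$@"
}

# Self-contained demo on synthetic data:
printf '09:06:00 accepting AMQP connection\n09:06:01 heartbeat\n09:13:20 client unexpectedly closed TCP connection\n' |
    find_gaps
# prints: gap of 439 s before: 09:13:20 client unexpectedly closed TCP connection
```

Running `find_gaps` over the rabbitmq log [2] should surface the same 9:06-9:13 hole without reading the file by hand.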

Revision history for this message
Dr. Jens Harbott (j-harbott) wrote :

There's an OOM happening at 9:13 in the above logs; maybe this is related to the reduction of swap space. IIRC Octavia has been seeing similar issues and made a patch to increase the swap size again.

What is unclear to me is why this only seems to be happening on stable/train and not for other branches.
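For reference, devstack-based Zuul jobs expose a `configure_swap_size` variable (in MiB) that controls how much swap the node gets. A sketch of the kind of fix mentioned above, assuming the job parents onto a devstack base job (the value and job definition here are illustrative, not the actual Octavia patch):

```yaml
# Hedged sketch: bump swap for the job via the devstack-provided variable.
- job:
    name: openstacksdk-functional-devstack
    parent: devstack
    vars:
      configure_swap_size: 8192
```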

Changed in neutron:
status: New → Confirmed
Revision history for this message
Slawek Kaplonski (slaweq) wrote :

Is this still an issue? If yes, I think it should be moved to some other project, something infra-related, as it doesn't really seem like a Neutron issue.

Changed in neutron:
status: Confirmed → Incomplete
Revision history for this message
Launchpad Janitor (janitor) wrote :

[Expired for neutron because there has been no activity for 60 days.]

Changed in neutron:
status: Incomplete → Expired