periodic-tripleo-ci-centos-7-ovb-3ctlr_1comp-featureset002-ocata is failing with Resource could not be found on mistral

Bug #1742465 reported by Arx Cruz
This bug affects 1 person
Affects: tripleo
Status: Fix Released
Importance: High
Assigned to: Unassigned
Milestone:

Bug Description

There are several errors like this on this job:

2018-01-10 04:59:32.040 ERROR /var/log/mistral/executor.log: 20877 ERROR mistral.engine.default_executor Traceback (most recent call last):
2018-01-10 04:59:32.040 ERROR /var/log/mistral/executor.log: 20877 ERROR mistral.engine.default_executor File "/usr/lib/python2.7/site-packages/mistral/engine/default_executor.py", line 89, in run_action
2018-01-10 04:59:32.040 ERROR /var/log/mistral/executor.log: 20877 ERROR mistral.engine.default_executor result = action.run()
2018-01-10 04:59:32.040 ERROR /var/log/mistral/executor.log: 20877 ERROR mistral.engine.default_executor File "/usr/lib/python2.7/site-packages/mistral/actions/openstack/base.py", line 142, in run
2018-01-10 04:59:32.040 ERROR /var/log/mistral/executor.log: 20877 ERROR mistral.engine.default_executor (self.__class__.__name__, self.client_method_name, e_str)
2018-01-10 04:59:32.040 ERROR /var/log/mistral/executor.log: 20877 ERROR mistral.engine.default_executor ActionException: HeatAction.stacks.get failed: <class 'heatclient.exc.HTTPNotFound'>: {"explanation": "The resource could not be found.", "code": 404, "error": {"message": "The Stack (overcloud) could not be found.", "traceback": "Traceback (most recent call last):\n\n File \"/usr/lib/python2.7/site-packages/heat/common/context.py\", line 407, in wrapped\n return func(self, ctx, *args, **kwargs)\n\n File \"/usr/lib/python2.7/site-packages/heat/engine/service.py\", line 488, in identify_stack\n raise exception.EntityNotFound(entity='Stack', name=stack_name)\n\nEntityNotFound: The Stack (overcloud) could not be found.\n", "type": "EntityNotFound"}, "title": "Not Found"}
2018-01-10 04:59:32.040 ERROR /var/log/mistral/executor.log: 20877 ERROR mistral.engine.default_executor

logs:
https://logs.rdoproject.org/openstack-periodic-24hr/periodic-tripleo-ci-centos-7-ovb-3ctlr_1comp-featureset002-ocata-upload/cf06cc1/undercloud/var/log/extra/errors.txt.gz

Changed in tripleo:
milestone: none → queens-3
wes hayutin (weshayutin)
Changed in tripleo:
assignee: Arx Cruz (arxcruz) → nobody
Revision history for this message
Adriano Petrich (apetrich) wrote :

This error by itself is not a problem; it is a logging issue. We do a get and, if it fails, we do a create. The problem is that the logs for the failing get are not suppressed and end up polluting /var/log/mistral.
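
For illustration, the flow being described is roughly the following. This is only a minimal sketch assuming a heatclient Client instance; the helper name get_or_create_stack is hypothetical and not a tripleo-common function:

from heatclient import exc as heat_exc

def get_or_create_stack(heat, stack_name, template):
    # Get-then-create fallback: the initial lookup is expected to 404 on a
    # fresh deployment, but the failed call is still logged, which is what
    # pollutes /var/log/mistral/executor.log.
    try:
        return heat.stacks.get(stack_name)
    except heat_exc.HTTPNotFound:
        # The stack does not exist yet, so fall back to creating it.
        return heat.stacks.create(stack_name=stack_name, template=template)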

You can see that happening on successful runs (like in this other patch that ran successfully: http://logs.openstack.org/86/506186/15/check/tripleo-ci-centos-7-containers-multinode/e8a016a/logs/undercloud/var/log/mistral/executor.log.txt.gz#_2018-01-16_15_49_18_011)

The problem is a bit below in your logs, just after that error, in:

https://logs.rdoproject.org/openstack-periodic-24hr/periodic-tripleo-ci-centos-7-ovb-3ctlr_1comp-featureset002-ocata-upload/cf06cc1/undercloud/var/log/mistral/executor.log.txt.gz#_2018-01-10_04_59_40_436

You can see that it starts a stack.create, but that never finishes; we get status: u'FAILED', u'message': u'The Heat stack is busy', and the logs end.

I think that somehow heat got stuck. Is this consistently failing?
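
If heat did get stuck, the "The Heat stack is busy" message usually means another stack action still holds the stack lock. A quick way to check would be something like the sketch below; this assumes a configured heatclient against the undercloud, and the endpoint and token values are placeholders, not taken from the job:

from heatclient import client as heat_client

HEAT_ENDPOINT = 'http://<undercloud>:8004/v1/<tenant_id>'  # placeholder
AUTH_TOKEN = '<token>'                                     # placeholder

heat = heat_client.Client('1', endpoint=HEAT_ENDPOINT, token=AUTH_TOKEN)
stack = heat.stacks.get('overcloud')
# A status like CREATE_IN_PROGRESS here means the lock is still held.
print(stack.stack_status)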

Revision history for this message
Adriano Petrich (apetrich) wrote :

Looking at the heat logs, it doesn't seem to be stuck. It seems to be creating a stack around timestamp 2018-01-10_04_59_40_436, so the issue seems to be that the mistral logs end abruptly.

Revision history for this message
Adriano Petrich (apetrich) wrote :
Revision history for this message
Thomas Herve (therve) wrote :
Changed in tripleo:
milestone: queens-3 → queens-rc1
Revision history for this message
wes hayutin (weshayutin) wrote :

Ocata periodic jobs are passing

Changed in tripleo:
status: Triaged → Fix Released