Stx-openstack apply-fail during provisioning

Bug #1917615 reported by Alexandru Dimofte
Affects: StarlingX
Status: New
Importance: Critical
Assigned to: Unassigned

Bug Description

Brief Description
-----------------
Stx-openstack apply fails during provisioning on all configurations (bare metal and virtual) except simplex.

Severity
--------
Critical: System/Feature is not usable due to the defect

Steps to Reproduce
------------------
Try to install stx image 20210302T141852Z. During provisioning, the stx-openstack application goes to apply-failed.

Expected Behavior
------------------
StarlingX installation should complete successfully.

Actual Behavior
----------------
Stx-openstack apply fails:
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada [-] Chart deploy [openstack-fm-rest-api] failed: armada.exceptions.k8s_exceptions.KubernetesWatchTimeoutException: Timed out waiting for pods (namespace=openstack, labels=(release_group=osh-openstack-fm-rest-api)). These pods were not ready=['fm-rest-api-6f5995558b-qmvxp', 'fm-rest-api-6f5995558b-x4p55']
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada Traceback (most recent call last):
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada File "/usr/local/lib/python3.6/dist-packages/armada/handlers/armada.py", line 225, in handle_result
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada result = get_result()
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada File "/usr/local/lib/python3.6/dist-packages/armada/handlers/armada.py", line 236, in <lambda>
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada if (handle_result(chart, lambda: deploy_chart(chart))):
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada File "/usr/local/lib/python3.6/dist-packages/armada/handlers/armada.py", line 214, in deploy_chart
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada chart, cg_test_all_charts, prefix, known_releases)
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada File "/usr/local/lib/python3.6/dist-packages/armada/handlers/chart_deploy.py", line 248, in execute
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada chart_wait.wait(timer)
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada File "/usr/local/lib/python3.6/dist-packages/armada/handlers/wait.py", line 134, in wait
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada wait.wait(timeout=timeout)
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada File "/usr/local/lib/python3.6/dist-packages/armada/handlers/wait.py", line 294, in wait
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada modified = self._wait(deadline)
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada File "/usr/local/lib/python3.6/dist-packages/armada/handlers/wait.py", line 354, in _wait
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada raise k8s_exceptions.KubernetesWatchTimeoutException(error)
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada armada.exceptions.k8s_exceptions.KubernetesWatchTimeoutException: Timed out waiting for pods (namespace=openstack, labels=(release_group=osh-openstack-fm-rest-api)). These pods were not ready=['fm-rest-api-6f5995558b-qmvxp', 'fm-rest-api-6f5995558b-x4p55']
2021-03-03 11:40:40.909 560 ERROR armada.handlers.armada
2021-03-03 11:40:40.910 560 ERROR armada.handlers.armada [-] Chart deploy(s) failed: ['openstack-fm-rest-api']
2021-03-03 11:40:41.191 560 INFO armada.handlers.lock [-] Releasing lock
2021-03-03 11:40:41.197 560 ERROR armada.cli [-] Caught internal exception: armada.exceptions.armada_exceptions.ChartDeployException: Exception deploying charts: ['openstack-fm-rest-api']
2021-03-03 11:40:41.197 560 ERROR armada.cli Traceback (most recent call last):
2021-03-03 11:40:41.197 560 ERROR armada.cli File "/usr/local/lib/python3.6/dist-packages/armada/cli/__init__.py", line 38, in safe_invoke
2021-03-03 11:40:41.197 560 ERROR armada.cli self.invoke()
2021-03-03 11:40:41.197 560 ERROR armada.cli File "/usr/local/lib/python3.6/dist-packages/armada/cli/apply.py", line 213, in invoke
2021-03-03 11:40:41.197 560 ERROR armada.cli resp = self.handle(documents, tiller)
2021-03-03 11:40:41.197 560 ERROR armada.cli File "/usr/local/lib/python3.6/dist-packages/armada/handlers/lock.py", line 81, in func_wrapper
2021-03-03 11:40:41.197 560 ERROR armada.cli return future.result()
2021-03-03 11:40:41.197 560 ERROR armada.cli File "/usr/lib/python3.6/concurrent/futures/_base.py", line 425, in result
2021-03-03 11:40:41.197 560 ERROR armada.cli return self.__get_result()
2021-03-03 11:40:41.197 560 ERROR armada.cli File "/usr/lib/python3.6/concurrent/futures/_base.py", line 384, in __get_result
2021-03-03 11:40:41.197 560 ERROR armada.cli raise self._exception
2021-03-03 11:40:41.197 560 ERROR armada.cli File "/usr/lib/python3.6/concurrent/futures/thread.py", line 56, in run
2021-03-03 11:40:41.197 560 ERROR armada.cli result = self.fn(*self.args, **self.kwargs)
2021-03-03 11:40:41.197 560 ERROR armada.cli File "/usr/local/lib/python3.6/dist-packages/armada/cli/apply.py", line 256, in handle
2021-03-03 11:40:41.197 560 ERROR armada.cli return armada.sync()
2021-03-03 11:40:41.197 560 ERROR armada.cli File "/usr/local/lib/python3.6/dist-packages/armada/handlers/armada.py", line 252, in sync
2021-03-03 11:40:41.197 560 ERROR armada.cli raise armada_exceptions.ChartDeployException(failures)
2021-03-03 11:40:41.197 560 ERROR armada.cli armada.exceptions.armada_exceptions.ChartDeployException: Exception deploying charts: ['openstack-fm-rest-api']
2021-03-03 11:40:41.197 560 ERROR armada.cli
command terminated with exit code 1
[sysadmin@controller-0 ~(keystone_admin)]$
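The failed chart name(s) can be pulled straight out of an armada log line like the one above. A minimal sketch; the sample line is copied from the output above, and in practice you would feed the real armada container log into the pipeline instead:

```shell
# Extract the failed chart name(s) from an armada "Chart deploy(s) failed" line.
log_line="2021-03-03 11:40:40.910 560 ERROR armada.handlers.armada [-] Chart deploy(s) failed: ['openstack-fm-rest-api']"

failed_charts=$(printf '%s\n' "$log_line" \
  | sed -n "s/.*Chart deploy(s) failed: \[\(.*\)\].*/\1/p" \
  | tr -d "'")

echo "$failed_charts"   # openstack-fm-rest-api
```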

Reproducibility
---------------
100% reproducible

System Configuration
--------------------
Two-node system, multi-node system, and dedicated storage; reproduced on both bare metal and virtual.

Branch/Pull Time/Commit
-----------------------
master

Last Pass
---------
20210226T024233Z

Timestamp/Logs
--------------
Logs will be attached.

Test Activity
-------------
Sanity

Workaround
----------
-

Revision history for this message
Ghada Khalil (gkhalil) wrote :

stx.5.0 / critical - recent sanity issue

Changed in starlingx:
importance: Undecided → Critical
tags: added: stx.5.0 stx.apps stx.distro.openstack
Revision history for this message
Dan Voiculeasa (dvoicule) wrote :

# The fm-rest-api pod pulls this image version:
Pulling image "registry.local:9001/docker.io/starlingx/stx-fm-rest-api:master-centos-stable-20210226T034144Z.0"

Was the last pass also run with stx-openstack pointing to 20210226T034144Z.0 tag?

Error reported by fm-rest-api pod:
#openstack_fm-rest-api-6f5995558b-qmvxp_0afec687-58d9-4c4a-8b48-bb7b3f2b15ff/fm-rest-api/26.log
2021-03-03T11:51:03.926738599Z stderr F ++ sed -n 's/^sql_connection=//p' /etc/fm/fm.conf
2021-03-03T11:51:03.928252459Z stderr F + export SQL_CONNECTION=
2021-03-03T11:51:03.928264243Z stderr F + SQL_CONNECTION=
2021-03-03T11:51:03.928268334Z stderr F + echo
2021-03-03T11:51:03.928273629Z stderr F + python /usr/local/bin/fm_db_sync_event_suppression.py
2021-03-03T11:51:05.049433572Z stderr F Postgres credentials required as argument.
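The pod log above shows the startup script extracting `sql_connection` from /etc/fm/fm.conf with `sed`; because the key is missing (or empty), `SQL_CONNECTION` ends up empty and `fm_db_sync_event_suppression.py` aborts with "Postgres credentials required". The extraction step can be reproduced against a scratch file; the file contents below are an assumption for illustration only:

```shell
# Reproduce the sed extraction from the pod log against a scratch fm.conf.
conf=$(mktemp)

# Case 1: sql_connection present -> sed prints the value.
printf 'sql_connection=postgresql://fm:secret@db/fm\n' > "$conf"
sed -n 's/^sql_connection=//p' "$conf"    # postgresql://fm:secret@db/fm

# Case 2: key missing -> sed prints nothing, so SQL_CONNECTION is empty,
# which is exactly the state the failing pod was in.
printf 'event_log_max_size=4000\n' > "$conf"
SQL_CONNECTION=$(sed -n 's/^sql_connection=//p' "$conf")
echo "SQL_CONNECTION='$SQL_CONNECTION'"   # SQL_CONNECTION=''

rm -f "$conf"
```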

Revision history for this message
Alexandru Dimofte (adimofte) wrote :

The last green build was 20210226T024233Z/; it used stx-fm-rest-api:master-centos-stable-20210223T150134Z.0.

The first red build was 20210227T023307Z/; it used stx-fm-rest-api:master-centos-stable-20210226T034144Z.0.
That build was affected by: https://bugs.launchpad.net/starlingx/+bug/1917308

The build from yesterday was 20210302T141852Z/; it also used stx-fm-rest-api:master-centos-stable-20210226T034144Z.0,
and it is RED: it fails during provisioning while applying stx-openstack, which is why I filed this bug.

Revision history for this message
Alexandru Dimofte (adimofte) wrote :

The latest StarlingX image, 20210303T033720Z/ from March 3rd, used stx-fm-rest-api:master-centos-stable-20210302T192758Z.0. The stx-openstack apply failure during provisioning is no longer visible with this image. However, the image is still affected by https://bugs.launchpad.net/starlingx/+bug/1917308.
