DC subcloud B&R restore fails pulling Ceph images

Bug #1896272 reported by Frank Miller
Affects: StarlingX
Status: Fix Released
Importance: Medium
Assigned to: Ovidiu Poncea

Bug Description

Brief Description
-----------------
On a distributed cloud system using NetApp as the storage backend, if a backup and restore (B&R) is performed on the system controller followed by a B&R on a subcloud, the Ceph images fail to be pulled on the subcloud.

Severity
--------
Major

Steps to Reproduce
------------------
1. Perform a B&R on the system controller.
2. Perform a B&R on a subcloud.

Expected Behavior
------------------
All pods on the subcloud should recover after the restore is complete.

Actual Behavior
----------------
Ceph pods do not recover on the subcloud and remain in ImagePullBackOff state.

Reproducibility
---------------
Reproducible

System Configuration
--------------------
Distributed cloud where NetApp is the storage backend on the system controller and Ceph is the storage backend on the subclouds.

Branch/Pull Time/Commit
-----------------------
Master branch

Last Pass
---------
n/a

Timestamp/Logs
--------------
n/a

Test Activity
-------------
B&R testing

Workaround
----------
Images have to be manually added back to the system controller's Docker registry.
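
The manual workaround can be sketched as a shell snippet run on the system controller. The image name/tag and the registry port are assumptions (registry.local:9001 is the usual StarlingX local registry address, and the rbd-provisioner image varies by release):

```shell
#!/bin/sh
# Sketch only: re-populate the central registry.local with the Ceph
# provisioner image so subclouds can pull it during restore.
# REGISTRY and IMAGES are assumptions; check your release for actual tags.
REGISTRY="registry.local:9001"
IMAGES="quay.io/external_storage/rbd-provisioner:v2.1.1-k8s1.11"

for img in $IMAGES; do
    local_img="${REGISTRY}/${img}"
    docker pull "$img"                # fetch from the upstream registry
    docker tag "$img" "$local_img"    # re-tag under registry.local
    docker push "$local_img"          # push back into the local registry
done
```

After the push, the subcloud's Ceph pods should be able to pull the image from the central registry and leave the ImagePullBackOff state.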

Frank Miller (sensfan22)
Changed in starlingx:
assignee: nobody → Ovidiu Poncea (ovidiu.poncea)
status: New → Triaged
importance: Undecided → Medium
tags: added: stx.5.0 stx.distcloud
Changed in starlingx:
status: Triaged → In Progress
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to config (master)

Fix proposed to branch: master
Review: https://review.opendev.org/752804

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to ansible-playbooks (master)

Fix proposed to branch: master
Review: https://review.opendev.org/752805

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix merged to config (master)

Reviewed: https://review.opendev.org/752804
Committed: https://git.openstack.org/cgit/starlingx/config/commit/?id=e6c4bcc7d7ad2b1d88cb0c06d842345e983fb18d
Submitter: Zuul
Branch: master

commit e6c4bcc7d7ad2b1d88cb0c06d842345e983fb18d
Author: Ovidiu Poncea <email address hidden>
Date: Fri Sep 18 21:14:19 2020 +0300

    Back up local registry images for platform-integ-apps on DC central

    Rbd-provisioner images are not backed up on a DC central w/o Ceph
    even when they are added to additional_local_registry_images.
    This means that on restore of DC central they are no longer present
    in registry.local. This causes subclouds with Ceph enabled to fail
    restore (or apply) as they pull rbd-provisioner from central.

    This commit fixes this.

    Change-Id: I12310f694db6b53881646d1a3ee61b2a6f833d1b
    Closes-Bug: 1896272
    Signed-off-by: Ovidiu Poncea <email address hidden>
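
For reference, the `additional_local_registry_images` variable mentioned in the commit message is an Ansible bootstrap override listing extra images to mirror into registry.local. A minimal sketch of how such a list is declared (the image tag is an assumption, and the override file location depends on the deployment):

```yaml
# Bootstrap overrides (e.g. localhost.yml); image tag is an assumption
additional_local_registry_images:
  - quay.io/external_storage/rbd-provisioner:v2.1.1-k8s1.11
```

The fix ensures images listed this way are included in the backup even when the central cloud itself has no Ceph backend.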

Changed in starlingx:
status: In Progress → Fix Released
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix merged to ansible-playbooks (master)

Reviewed: https://review.opendev.org/752805
Committed: https://git.openstack.org/cgit/starlingx/ansible-playbooks/commit/?id=1464e57f176d43bd348bdca1efffbe5e99c16de9
Submitter: Zuul
Branch: master

commit 1464e57f176d43bd348bdca1efffbe5e99c16de9
Author: Ovidiu Poncea <email address hidden>
Date: Fri Sep 18 21:07:33 2020 +0300

    Back up local registry images for platform-integ-apps on DC central

    Rbd-provisioner images are not backed up on a DC central w/o Ceph even
    when they are added to additional_local_registry_images. This means
    that on restore of DC central they are no longer present in
    registry.local. This causes subclouds with Ceph enabled to fail restore
    (or apply) as they pull rbd-provisioner from central.

    This commit fixes this.

    Change-Id: Iabc10f694db6b53881646d1a3ee61b2a6f833d1a
    Closes-Bug: 1896272
    Depends-On: I12310f694db6b53881646d1a3ee61b2a6f833d1b
    Signed-off-by: Ovidiu Poncea <email address hidden>
