Using python2 clients when keystone rocky is python3

Bug #1806111 reported by David Ames
This bug affects 8 people
Affects                              Status        Importance  Assigned to    Milestone
OpenStack Cinder Charm               Fix Released  Undecided   Frode Nordahl  21.10
OpenStack Keystone Charm             Fix Released  Critical    Frode Nordahl  21.04
OpenStack Keystone LDAP integration  Fix Released  Critical    Frode Nordahl  21.04

Bug Description

The keystone-ldap charm attempts to use python2 clients which are not installed when the primary keystone is deployed in Rocky.

Revision history for this message
David Ames (thedac) wrote :

Also, this is related to https://bugs.launchpad.net/keystone/+bug/1798184

The fix for the charm will still depend on the fix for Bug #1798184.

Changed in charm-keystone-ldap:
status: New → Triaged
importance: Undecided → High
assignee: nobody → David Ames (thedac)
milestone: none → 19.04
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to charm-keystone-ldap (master)

Fix proposed to branch: master
Review: https://review.openstack.org/621275

Changed in charm-keystone-ldap:
status: Triaged → In Progress
Revision history for this message
Frode Nordahl (fnordahl) wrote :

Note that resolution of this bug depends on the fix and SRU of upstream bug 1798184 and bug 1820333.

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to charm-keystone-ldap (stable/18.11)

Fix proposed to branch: stable/18.11
Review: https://review.openstack.org/645075

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix merged to charm-keystone-ldap (master)

Reviewed: https://review.openstack.org/621275
Committed: https://git.openstack.org/cgit/openstack/charm-keystone-ldap/commit/?id=4163fdcbeefbe197204ae06177e81d776baa548e
Submitter: Zuul
Branch: master

commit 4163fdcbeefbe197204ae06177e81d776baa548e
Author: David Ames <email address hidden>
Date: Fri Nov 30 11:28:05 2018 -0800

    Enable Rocky and python3

    When the primary keystone is deployed with rocky and python3 the charm
    fails to install the correct python3 packages and use the correct
    clients.

    Note: A related bug #1798184 will cause the tests to fail. A complete
    Rocky python3 solution is dependent on #1798184 being resolved.

    Change-Id: I42d8a5bfff3200d18e7bad0bd29edf12aa6a05c7
    Closes-Bug: #1806111

Changed in charm-keystone-ldap:
status: In Progress → Fix Committed
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix merged to charm-keystone-ldap (stable/18.11)

Reviewed: https://review.openstack.org/645075
Committed: https://git.openstack.org/cgit/openstack/charm-keystone-ldap/commit/?id=84fc12c09555afd23d84b41363693379157aaa53
Submitter: Zuul
Branch: stable/18.11

commit 84fc12c09555afd23d84b41363693379157aaa53
Author: David Ames <email address hidden>
Date: Fri Nov 30 11:28:05 2018 -0800

    Enable Rocky and python3

    When the primary keystone is deployed with rocky and python3 the charm
    fails to install the correct python3 packages and use the correct
    clients.

    Note: A related bug #1798184 will cause the tests to fail. A complete
    Rocky python3 solution is dependent on #1798184 being resolved.

    Change-Id: I42d8a5bfff3200d18e7bad0bd29edf12aa6a05c7
    Closes-Bug: #1806111

Frode Nordahl (fnordahl)
Changed in charm-keystone-ldap:
status: Fix Committed → Fix Released
Revision history for this message
Drew Freiberger (afreiberger) wrote :

I think this is still an open issue.

When upgrading keystone with keystone-ldap from the bionic distro to cloud:bionic-rocky, the keystone/leader unit goes into an error state, unable to reach the proxy manager:

unit-keystone-0: 19:38:43 ERROR unit.keystone/0.juju-log The call within manager.py failed with the error: 'An unexpected error prevented the server from fulfilling your request. (HTTP 500) (Request-ID: req-c57e4eb3-7d3c-4580-8667-f2324e6d92a9)'. The call was: path=['resolve_domain_id'], args=('default',), kwargs={}, api_version=None
unit-keystone-0: 19:38:43 DEBUG unit.keystone/0.config-changed An unexpected error prevented the server from fulfilling your request. (HTTP 500) (Request-ID: req-c57e4eb3-7d3c-4580-8667-f2324e6d92a9)
unit-keystone-0: 19:38:43 INFO unit.keystone/0.juju-log Retrying '_proxy_manager_call' 5 more times (delay=3)
unit-keystone-0: 19:38:47 ERROR unit.keystone/0.juju-log The call within manager.py failed with the error: 'An unexpected error prevented the server from fulfilling your request. (HTTP 500) (Request-ID: req-60aa9ecd-56b6-489e-ab3b-4fe0d4c96059)'. The call was: path=['resolve_domain_id'], args=('default',), kwargs={}, api_version=None
unit-keystone-0: 19:38:47 DEBUG unit.keystone/0.config-changed An unexpected error prevented the server from fulfilling your request. (HTTP 500) (Request-ID: req-60aa9ecd-56b6-489e-ab3b-4fe0d4c96059)
unit-keystone-0: 19:38:47 INFO unit.keystone/0.juju-log Retrying '_proxy_manager_call' 4 more times (delay=6)
unit-keystone-0: 19:38:53 ERROR unit.keystone/0.juju-log The call within manager.py failed with the error: 'An unexpected error prevented the server from fulfilling your request. (HTTP 500) (Request-ID: req-9842ec31-848e-4e9c-a924-81174fed4ab6)'. The call was: path=['resolve_domain_id'], args=('default',), kwargs={}, api_version=None
unit-keystone-0: 19:38:53 DEBUG unit.keystone/0.config-changed An unexpected error prevented the server from fulfilling your request. (HTTP 500) (Request-ID: req-9842ec31-848e-4e9c-a924-81174fed4ab6)
unit-keystone-0: 19:38:54 INFO unit.keystone/0.juju-log Retrying '_proxy_manager_call' 3 more times (delay=9)

unit-keystone-0: 19:39:03 ERROR unit.keystone/0.juju-log The call within manager.py failed with the error: 'An unexpected error prevented the server from fulfilling your request. (HTTP 500) (Request-ID: req-e3d06cd2-b4c6-48bf-a61e-36a9a5edf806)'. The call was: path=['resolve_domain_id'], args=('default',), kwargs={}, api_version=None
unit-keystone-0: 19:39:03 DEBUG unit.keystone/0.config-changed An unexpected error prevented the server from fulfilling your request. (HTTP 500) (Request-ID: req-e3d06cd2-b4c6-48bf-a61e-36a9a5edf806)
unit-keystone-0: 19:39:03 INFO unit.keystone/0.juju-log Retrying '_proxy_manager_call' 2 more times (delay=12)
unit-keystone-0: 19:39:16 ERROR unit.keystone/0.juju-log The call within manager.py failed with the error: 'An unexpected error prevented the server from fulfilling your request. (HTTP 500) (Request-ID: req-311c11ac-a7c1-47bb-842b-ef9f8721544e)'. The call was: path=['resolve_domain_id'], args=('default',), kwargs={}, api_version=None
unit-keystone-0: 19:39:16 DEBUG unit.keystone/0.config-changed An unex...

Revision history for this message
Drew Freiberger (afreiberger) wrote :

To summarize my last comment, when upgrading a cloud that has keystone-ldap deployed as a subordinate to keystone from bionic-queens to bionic-rocky with the command:

juju config keystone openstack-origin=cloud:bionic-rocky

The above errors are seen. I believe that keystone openstack-upgrade will need to check whether keystone-ldap is a subordinate, or whether its packages are installed, and install python3-ldappool so that when keystone is restarted on the openstack-upgraded version, the ldap plugin doesn't fail to load.
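
An illustrative sketch of that kind of guard, assuming a hypothetical helper in the keystone charm's openstack-upgrade path (this is not the actual charm code; the names are invented):

    # Hypothetical guard run by the keystone charm before restarting services
    # on the upgraded release: if the ldap subordinate is related (or its
    # python2 packages are present), ensure the python3 equivalent is installed.
    import subprocess

    def package_installed(name):
        """Return True if the Debian package 'name' is installed."""
        result = subprocess.run(
            ['dpkg', '-s', name],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
        return result.returncode == 0

    def ensure_ldap_python3_packages(ldap_subordinate_related):
        needs_py3 = ldap_subordinate_related or package_installed('python-ldappool')
        if needs_py3 and not package_installed('python3-ldappool'):
            subprocess.check_call(['apt-get', 'install', '-y', 'python3-ldappool'])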

The workaround is to also run the command "juju run -a keystone 'apt-get install -y python3-ldappool'".

Going to unsub field-crit and sub field-medium, as this affects LTS upgrades from bionic-queens to focal-ussuri.

Frode Nordahl (fnordahl)
Changed in charm-keystone:
status: New → In Progress
importance: Undecided → High
assignee: nobody → Frode Nordahl (fnordahl)
Frode Nordahl (fnordahl)
Changed in charm-keystone:
milestone: none → 21.04
Changed in charm-keystone-ldap:
status: Fix Released → In Progress
assignee: David Ames (thedac) → Frode Nordahl (fnordahl)
milestone: 19.04 → 21.04
tags: added: openstack-upgrade
Revision history for this message
Frode Nordahl (fnordahl) wrote :

Drew, thank you for bringing this problem to our attention.

This is a problem we would see in multiple principal-subordinate plugin-type relations; cinder storage backends come to mind as another area where this might be a problem.

It is particularly visible in the python2 to python3 migration, but I would not be surprised if we hit breaking upgrade changes at other stages in the future too.

What we would like to do is implement general helpers that allow subordinates to register a map of OpenStack release to install/purge package pairs with their principal charm (thanks to Liam Young for that idea). With this in place, the principal charm can perform the necessary package actions in place at upgrade time, without risking charm relation RPC-type processing during this critical step of the upgrade process.

After the package upgrade, the principal will put a notification on the subordinate relation, allowing the subordinate charm to perform any necessary post-upgrade tasks.
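
As a rough illustration of what such a registration could look like, here is a minimal sketch assuming a simple JSON structure on the relation (the key names and package lists are assumptions for illustration, not the actual interface):

    # Hypothetical "releases packages map" a subordinate such as keystone-ldap
    # could publish to its principal; the real charms.openstack implementation
    # may use different key names and helpers.
    import json

    releases_packages_map = {
        'queens': {
            'install': ['python-ldap', 'python-ldappool'],
            'purge': [],
        },
        'rocky': {
            'install': ['python3-ldap', 'python3-ldappool'],
            'purge': ['python-ldap', 'python-ldappool'],
        },
    }

    # Serialised onto the subordinate relation so the principal can act on it
    # in one step during openstack-upgrade, before restarting the payload.
    relation_setting = {
        'releases-packages-map': json.dumps(releases_packages_map, sort_keys=True),
    }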

Revision history for this message
Billy Olsen (billy-olsen) wrote :

Based on comment #8, there is a workaround identified (juju run -a keystone 'apt-get install -y python3-ldappool') and this was reduced to field-medium. This is classified as both field-critical and field-high. Since a workaround is available, I am removing the field-critical. I will keep it as field-high rather than matching the comment from Drew, since this is more critical than medium.

Changed in charm-keystone-ldap:
importance: High → Critical
Changed in charm-keystone:
importance: High → Critical
Changed in charm-keystone-ldap:
status: In Progress → Fix Committed
Changed in charm-keystone:
status: In Progress → Fix Committed
Changed in charm-keystone:
status: Fix Committed → In Progress
Changed in charm-keystone:
status: In Progress → Fix Committed
Changed in charm-keystone-ldap:
status: Fix Committed → Fix Released
Changed in charm-keystone:
status: Fix Committed → Fix Released
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Related fix merged to charm-cinder (master)

Reviewed: https://review.opendev.org/c/openstack/charm-cinder/+/782787
Committed: https://opendev.org/openstack/charm-cinder/commit/3d043ae144b86071033101f2b0119e90ff526f7a
Submitter: "Zuul (22348)"
Branch: master

commit 3d043ae144b86071033101f2b0119e90ff526f7a
Author: Aurelien Lourot <email address hidden>
Date: Tue Sep 28 10:44:26 2021 +0200

    Process subordinate releases packages map

    For principal - subordinate plugin type relations where the
    principal Python payload imports code from packages managed by a
    subordinate, upgrades can be problematic.

    This change will allow a subordinate charm that has opted into the
    feature to inform its principal about all implemented release -
    packages combinations ahead of time. With this information in place
    the principal can do the upgrade in one operation without risk of
    charm relation RPC type processing at a critical moment.

    Related-Bug: #1806111
    Change-Id: Ic8ea4fe6109081814045adea7ce6688b3564c9e5
    Co-authored-by: Aurelien Lourot <email address hidden>

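A hypothetical sketch of the principal-side processing this commit describes (helper names invented for illustration; the real implementation lives in charms.openstack and the cinder charm):

    # Hypothetical principal-side handling of the subordinate map during an
    # OpenStack upgrade to 'target_release'.
    import json
    import subprocess

    def process_subordinate_map(relation_data, target_release):
        """Install/purge the packages a subordinate registered for a release."""
        raw = relation_data.get('releases-packages-map')
        if not raw:
            return
        entry = json.loads(raw).get(target_release, {})
        to_install = entry.get('install', [])
        to_purge = entry.get('purge', [])
        if to_install:
            subprocess.check_call(['apt-get', 'install', '-y'] + to_install)
        if to_purge:
            subprocess.check_call(['apt-get', 'purge', '-y'] + to_purge)
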
Revision history for this message
Felipe Reyes (freyes) wrote :

To get cinder to manage the upgrade from python2 to python3 in environments where purestorage is in use, the following patches were needed:

https://review.opendev.org/c/openstack/charm-cinder-purestorage/+/782762
https://review.opendev.org/c/openstack/charms.openstack/+/781487
https://review.opendev.org/c/openstack/charm-interface-cinder-backend/+/782761

All 3 of them have merged, so I'm marking the task for cinder as "Fix Committed".

Changed in charm-cinder:
status: New → Fix Committed
assignee: nobody → Frode Nordahl (fnordahl)
Changed in charm-cinder:
milestone: none → 21.10
Changed in charm-cinder:
status: Fix Committed → Fix Released