I'm facing this issue in an Ussuri deployment that was upgraded from Queens (going through each intermediate OpenStack version in between). I'm running these charms:

- cs:keystone-323
- cs:openstack-dashboard-313

I'm wondering if in my case it could be linked to the relation data, as this is what I see:

$ juju run --application openstack-dashboard "relation-get -r identity-service:116 - keystone/0"
- Stdout: |
    admin_domain_id: c7f2b8aaedbb4fb0a128ebb4ee03f030
    api_version: "3"
    auth_host: keystone-int.acme.com
    auth_port: "35357"
    auth_protocol: https
    egress-subnets: 10.123.45.6/32
    ingress-address: 10.123.45.6
    private-address: 10.123.45.6
    region: RegionOne
    service_host: keystone.acme.com
    service_port: "5000"
    service_protocol: https
  UnitId: openstack-dashboard/0
- Stdout: |
    admin_domain_id: c7f2b8aaedbb4fb0a128ebb4ee03f030
    api_version: "3"
    auth_host: keystone-int.acme.com
    auth_port: "35357"
    auth_protocol: https
    egress-subnets: 10.123.45.6/32
    ingress-address: 10.123.45.6
    private-address: 10.123.45.6
    region: RegionOne
    service_host: keystone.acme.com
    service_port: "5000"
    service_protocol: https
  UnitId: openstack-dashboard/2
- Stdout: |
    admin_domain_id: c7f2b8aaedbb4fb0a128ebb4ee03f030
    api_version: "3"
    auth_host: keystone-int.acme.com
    auth_port: "35357"
    auth_protocol: https
    egress-subnets: 10.123.45.6/32
    ingress-address: 10.123.45.6
    private-address: 10.123.45.6
    region: RegionOne
    service_host: keystone.acme.com
    service_port: "5000"
    service_protocol: https
  UnitId: openstack-dashboard/3

$ juju run --application openstack-dashboard "relation-get -r identity-service:116 - keystone/1"
- Stdout: |
    admin_domain_id: c7f2b8aaedbb4fb0a128ebb4ee03f030
    api_version: "3"
    auth_host: keystone-int.acme.com
    auth_port: "35357"
    auth_protocol: https
    created_roles: Member
    egress-subnets: 10.123.45.7/32
    ingress-address: 10.123.45.7
    internal_host: keystone-int.acme.com
    internal_port: "5000"
    internal_protocol: https
    private-address: 10.123.45.7
    region: RegionOne
    service_host: keystone.acme.com
    service_port: "5000"
    service_protocol: https
  UnitId: openstack-dashboard/0
- Stdout: |
    admin_domain_id: c7f2b8aaedbb4fb0a128ebb4ee03f030
    api_version: "3"
    auth_host: keystone-int.acme.com
    auth_port: "35357"
    auth_protocol: https
    created_roles: Member
    egress-subnets: 10.123.45.7/32
    ingress-address: 10.123.45.7
    internal_host: keystone-int.acme.com
    internal_port: "5000"
    internal_protocol: https
    private-address: 10.123.45.7
    region: RegionOne
    service_host: keystone.acme.com
    service_port: "5000"
    service_protocol: https
  UnitId: openstack-dashboard/2
- Stdout: |
    admin_domain_id: c7f2b8aaedbb4fb0a128ebb4ee03f030
    api_version: "3"
    auth_host: keystone-int.acme.com
    auth_port: "35357"
    auth_protocol: https
    created_roles: Member
    egress-subnets: 10.123.45.7/32
    ingress-address: 10.123.45.7
    internal_host: keystone-int.acme.com
    internal_port: "5000"
    internal_protocol: https
    private-address: 10.123.45.7
    region: RegionOne
    service_host: keystone.acme.com
    service_port: "5000"
    service_protocol: https
  UnitId: openstack-dashboard/3

$ juju run --application openstack-dashboard "relation-get -r identity-service:116 - keystone/2"
- Stdout: |
    egress-subnets: 10.123.45.8/32
    ingress-address: 10.123.45.8
    private-address: 10.123.45.8
  UnitId: openstack-dashboard/0
- Stdout: |
    egress-subnets: 10.123.45.8/32
    ingress-address: 10.123.45.8
    private-address: 10.123.45.8
  UnitId: openstack-dashboard/2
- Stdout: |
    egress-subnets: 10.123.45.8/32
    ingress-address: 10.123.45.8
    private-address: 10.123.45.8
  UnitId: openstack-dashboard/3

The relation data is different for each keystone unit. Only the relation with the keystone leader unit (keystone/1) has the "created_roles" key. If I compare with another environment that underwent the same upgrade, I see identical relation data for all keystone units, and all of them contain the "created_roles" key.

Should the relation data be identical for all three keystone units? Is that why I'm facing the issue?
How can I fix this, and why would it have happened in this environment but not the other?
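For reference, here is how I compared the keys (this is just a local diff of the key names copied from the relation-get output above, not anything run against Juju itself). It shows exactly which keys keystone/1 publishes that keystone/0 does not:

```shell
#!/usr/bin/env bash
# Key names copied by hand from the relation-get output above.
# keystone/0 publishes these:
k0="admin_domain_id api_version auth_host auth_port auth_protocol \
egress-subnets ingress-address private-address region \
service_host service_port service_protocol"
# keystone/1 publishes the same keys plus four more:
k1="$k0 created_roles internal_host internal_port internal_protocol"

# comm -13 prints only the lines unique to the second input,
# i.e. the keys present on keystone/1 but missing from keystone/0.
comm -13 <(echo "$k0" | tr ' ' '\n' | sort) \
         <(echo "$k1" | tr ' ' '\n' | sort)
```

This prints created_roles, internal_host, internal_port and internal_protocol, which matches what I described above: only the leader's side of the relation carries those keys in the broken environment.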