Removing duplicated items doesn't work in case of federations
| Affects | Status | Importance | Assigned to | Milestone |
|---|---|---|---|---|
| OpenStack Identity (keystone) | Fix Released | Medium | Lance Bragstad | pike-rc2 |
Bug Description
Commit eed233cac8f34ce introduced a mechanism that can lead to situations where objects are duplicated, so code for filtering out the duplicates was added. It was implemented in the following way:

    domains = [dict(t) for t in set([tuple(d.items()) for d in domains])]
where domains is a list of dicts, each of which contains information about the corresponding domain. This code can work fine in some situations, but in general it can behave incorrectly, because the dict `items` method returns key-value pairs in arbitrary order according to https:/
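To illustrate the failure mode described above, here is a minimal sketch (the domain values and variable names are hypothetical, not taken from keystone): two equal dicts whose items come back in different orders produce different tuples, so the `set()` fails to collapse them. Sorting the items first makes the tuple canonical.

```python
# Two equal dicts with different item order (arbitrary in Python 2,
# insertion order in Python 3.7+).
a = {"id": "d1", "name": "Default"}
b = {"name": "Default", "id": "d1"}
assert a == b  # equal as mappings

domains = [a, b]

# The pattern from the bug: tuple(d.items()) is not canonical, so the
# equal dicts yield different tuples and both "duplicates" survive.
deduped = [dict(t) for t in set(tuple(d.items()) for d in domains)]
assert len(deduped) == 2  # duplicates were NOT removed

# Sorting the items first gives a canonical tuple, so the dedup works.
deduped_fixed = [dict(t) for t in set(tuple(sorted(d.items())) for d in domains)]
assert len(deduped_fixed) == 1
```

Note that this sorted-items variant is only a sketch of one possible fix; it still assumes all dict values are hashable.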
This code was introduced upstream on Thu Feb 25 21:39:15 2016, so it appears to remain in the newton, ocata, and master branches.
tags: added: federation

Changed in keystone:
assignee: nobody → Dmitry Stepanenko (dstepanenko)

description: updated

Changed in keystone:
status: Invalid → New

Changed in keystone:
status: New → In Progress

Changed in keystone:
milestone: none → pike-rc2
The change referenced in the description [0] includes tests for duplicate projects via direct and indirect assignments [1]. This should be testing the code in question.
I ran the tests locally and ensured there weren't duplicate projects or domains as a result of the API. Are you able to recreate this with a test of some kind?
[0] https://review.openstack.org/#/c/284943/63
[1] https://github.com/openstack/keystone/blob/bebd7056ad33d294871013067cb7367bc6db1a13/keystone/tests/unit/test_v3_federation.py#L3069-L3132