Keystone needs a test that verifies we do not grow the token's data scope across changes without explicitly knowing it. The basic reasoning is to make sure we can track every element in a given token version and easily see when that set changes.
Currently there are checks that verify some token data is present, but there are no checks that verify tokens don't end up with unknown or extra data. There also appears to be a lot of duplicated data in tokens at this point; that duplication should be kept to a minimum.
This will also help us define the token specification and keep that explicit specification up to date. Developers can then use the specification to know exactly what is guaranteed to be in every token and pull the data from the correct location.
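A minimal sketch of what such a test could look like, assuming a hypothetical spec and a placeholder token body (these names are not Keystone's actual test API): the set of keys in an issued token is compared against an explicitly maintained specification, so any new field fails the test until it is added to the spec deliberately.

```python
import unittest

# Hypothetical specification of the fields a scoped token is expected to carry.
SCOPED_TOKEN_SPEC = {
    'expires_at', 'issued_at', 'methods', 'audit_ids',
    'user', 'project', 'roles', 'catalog',
}


class TokenDataScopeTest(unittest.TestCase):
    def _issue_scoped_token(self):
        # Placeholder: a real test would call the v3 authentication API or the
        # token provider and return the token body as a dict.
        return {
            'expires_at': '2024-01-01T00:00:00Z',
            'issued_at': '2023-12-31T23:00:00Z',
            'methods': ['password'],
            'audit_ids': ['abc123'],
            'user': {'id': 'u1', 'name': 'demo'},
            'project': {'id': 'p1', 'name': 'demo'},
            'roles': [{'id': 'r1', 'name': 'member'}],
            'catalog': [],
        }

    def test_token_matches_spec_exactly(self):
        token = self._issue_scoped_token()
        extra = set(token) - SCOPED_TOKEN_SPEC
        missing = SCOPED_TOKEN_SPEC - set(token)
        # Fail on unknown/extra data as well as on missing guaranteed data.
        self.assertFalse(extra, 'unexpected fields in token: %s' % extra)
        self.assertFalse(missing, 'fields missing from token: %s' % missing)
```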
What data are you trying to limit? An unscoped token can become a scoped one, and vice versa, which means this is already happening. Beyond that, we have catalog data in the token.