Some info (more to follow later). We have caching enabled in Keystone and in the middleware layer in all environments, but only dev runs Mitaka Keystone, and we ONLY see this issue in dev.
It also comes and goes. Yesterday it happened on all nodes from 16:15 to 16:45 then went away. The day before it happened from 17:19 to 20:16 then stopped. Before that it last occurred on July 12.
Here is what I see out of the middleware stack:
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack [req-cb1199ba-a752-481c-82bb-9dcbb35d5330 dd8836bbbe4545aa96e11eb2e54251c9 41c98257b5de46f3aefe5b7ad23d2f65 - - -] Caught error: 'interface'
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack Traceback (most recent call last):
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack   File "/venv/local/lib/python2.7/site-packages/nova/api/openstack/__init__.py", line 134, in __call__
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack     return req.get_response(self.application)
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack   File "/venv/local/lib/python2.7/site-packages/webob/request.py", line 1317, in send
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack     application, catch_exc_info=False)
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack   File "/venv/local/lib/python2.7/site-packages/webob/request.py", line 1281, in call_application
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack     app_iter = application(self.environ, start_response)
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack   File "/venv/local/lib/python2.7/site-packages/webob/dec.py", line 144, in __call__
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack     return resp(environ, start_response)
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack   File "/venv/local/lib/python2.7/site-packages/webob/dec.py", line 130, in __call__
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack     resp = self.call_func(req, *args, **self.kwargs)
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack   File "/venv/local/lib/python2.7/site-packages/webob/dec.py", line 195, in call_func
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack     return self.func(req, *args, **kwargs)
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack   File "/venv/local/lib/python2.7/site-packages/keystonemiddleware/auth_token/__init__.py", line 452, in __call__
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack     response = self.process_request(req)
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack   File "/venv/local/lib/python2.7/site-packages/keystonemiddleware/auth_token/__init__.py", line 752, in process_request
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack     self._include_service_catalog)
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack   File "/venv/local/lib/python2.7/site-packages/keystonemiddleware/auth_token/_request.py", line 163, in set_user_headers
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack     catalog = _v3_to_v2_catalog(catalog)
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack   File "/venv/local/lib/python2.7/site-packages/keystonemiddleware/auth_token/_request.py", line 47, in _v3_to_v2_catalog
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack     interface_name = v3_endpoint['interface'].lower() + 'URL'
2016-08-02 15:58:48.489 144 ERROR nova.api.openstack KeyError: 'interface'
We see this in multiple services that run different versions of the middleware, including Glance, Cinder, Nova, and Designate.