Activity log for bug #1812935

Date Who What changed Old value New value Message
2019-01-22 23:06:15 Michele Baldessari bug added bug
2019-01-30 13:39:50 Herve Beraud bug watch added https://github.com/containers/libpod/issues/2240
2019-01-30 15:01:31 Ben Nemec oslo.cache: status New Confirmed
2019-01-30 15:01:35 Ben Nemec oslo.cache: importance Undecided High
2019-02-01 09:21:15 Herve Beraud oslo.cache: assignee Herve Beraud (herveberaud)
2019-02-01 17:07:32 OpenStack Infra oslo.cache: status Confirmed In Progress
2019-04-09 14:02:49 James Page bug task added python-oslo.cache (Ubuntu)
2019-04-09 14:02:56 James Page nominated for series Ubuntu Disco
2019-04-09 14:02:56 James Page bug task added python-oslo.cache (Ubuntu Disco)
2019-04-09 14:02:56 James Page nominated for series Ubuntu Cosmic
2019-04-09 14:02:56 James Page bug task added python-oslo.cache (Ubuntu Cosmic)
2019-04-09 14:03:06 James Page python-oslo.cache (Ubuntu Cosmic): status New Triaged
2019-04-09 14:03:08 James Page python-oslo.cache (Ubuntu Disco): status New Triaged
2019-04-09 14:03:10 James Page python-oslo.cache (Ubuntu Cosmic): importance Undecided High
2019-04-09 14:03:11 James Page python-oslo.cache (Ubuntu Disco): importance Undecided High
2019-04-09 14:03:21 James Page bug added subscriber Ubuntu Stable Release Updates Team
2019-04-09 14:29:02 Corey Bryant bug task added cloud-archive
2019-04-09 14:30:01 Corey Bryant nominated for series cloud-archive/rocky
2019-04-09 14:30:01 Corey Bryant bug task added cloud-archive/rocky
2019-04-09 14:30:01 Corey Bryant nominated for series cloud-archive/stein
2019-04-09 14:30:01 Corey Bryant bug task added cloud-archive/stein
2019-04-09 14:30:10 Corey Bryant cloud-archive/rocky: status New Triaged
2019-04-09 14:30:12 Corey Bryant cloud-archive/stein: status New Triaged
2019-04-09 14:30:14 Corey Bryant cloud-archive/rocky: importance Undecided High
2019-04-09 14:30:16 Corey Bryant cloud-archive/stein: importance Undecided High
2019-04-09 14:33:17 Corey Bryant cloud-archive/stein: status Triaged Fix Released
2019-04-09 14:35:32 James Page python-oslo.cache (Ubuntu Disco): status Triaged Fix Released
2019-04-09 14:37:31 Jason Hobbs tags cdo-qa cdo-release-blocker foundations-engine
2019-04-09 14:37:57 Corey Bryant python-oslo.cache (Ubuntu Disco): status Fix Released In Progress
2019-04-09 14:38:00 Corey Bryant python-oslo.cache (Ubuntu Disco): status In Progress Fix Released
2019-04-09 14:58:47 Corey Bryant description nova conductor running on a rhel8 host inside f28 based containers hits the following error: 2019-01-17 13:59:37.049 46 DEBUG oslo_concurrency.lockutils [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] Lock "b051d003-482d-4cb7-810c-8d256e6c879e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py:327 2019-01-17 13:59:37.050 46 DEBUG oslo_concurrency.lockutils [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] Lock "b051d003-482d-4cb7-810c-8d256e6c879e" released by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py:339 2019-01-17 13:59:37.060 46 DEBUG oslo_db.sqlalchemy.engines [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3.6/site-packages/oslo_db/sqlalchemy/engines.py:307 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] Exception during message handling: TypeError: object() takes no parameters 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server Traceback (most recent call last): 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 163, in _get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = self.queue.pop().connection 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server IndexError: pop from an empty deque 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server Traceback (most recent call last): 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/server.py", line 166, in _process_incoming 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 265, in dispatch 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 194, in _do_dispatch 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/conductor/manager.py", line 1303, in schedule_and_build_instances 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server instance.create() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File 
"/usr/lib/python3.6/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return fn(self, *args, **kwargs) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/instance.py", line 607, in create 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server self._load_ec2_ids() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/instance.py", line 983, in _load_ec2_ids 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server self.ec2_ids = objects.EC2Ids.get_by_instance(self._context, self) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_versionedobjects/base.py", line 184, in wrapper 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server result = fn(cls, context, *args, **kwargs) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/ec2.py", line 216, in get_by_instance 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server ec2_ids = cls._get_ec2_ids(context, instance) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/ec2.py", line 200, in _get_ec2_ids 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server ec2_ids['instance_id'] = ec2utils.id_to_ec2_inst_id(instance.uuid) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/api/ec2/ec2utils.py", line 188, in id_to_ec2_inst_id 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server int_id = get_int_id_from_instance_uuid(ctxt, instance_id) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/api/ec2/ec2utils.py", line 47, in memoizer 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server value = _CACHE.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/cache_utils.py", line 107, in get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server value = self.region.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/dogpile/cache/region.py", line 645, in get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server value = self.backend.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/dogpile/cache/backends/memcached.py", line 161, in get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server value = self.client.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/backends/memcache_pool.py", line 31, in _run_method 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server with self.client_pool.acquire() as client: 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib64/python3.6/contextlib.py", line 81, in __enter__ 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return next(self.gen) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 128, in acquire 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = self.get(timeout=self._connection_get_timeout) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/eventlet/queue.py", line 303, in get 2019-01-17 13:59:37.096 46 
ERROR oslo_messaging.rpc.server return self._get() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 214, in _get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = ConnectionPool._get(self) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 165, in _get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = self._create_connection() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 206, in _create_connection 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return _MemcacheClient(self.urls, **self._arguments) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server TypeError: object() takes no parameters 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server nova.conf has: ()[nova@standalone /]$ grep -v ^# /etc/nova/nova.conf |grep '[a-zA-Z]' |grep -i mem backend=oslo_cache.memcache_pool memcache_servers=192.168.24.2:11211 memcached_servers=192.168.24.2:11211 memcache seems to be up: [root@standalone ~]# podman top memcached USER PID PPID %CPU ELAPSED TTY TIME COMMAND memcached 1 0 0.000 2h2m27.45050385s ? 0s dumb-init --single-child -- /bin/bash -c source /etc/sysconfig/memcached; /usr/bin/memcached -p ${PORT} -u ${USER} -m ${CACHESIZE} -c ${MAXCONN} $OPTIONS memcached 8 1 0.000 2h2m27.450722828s ? 0s /bin/bash -c source /etc/sysconfig/memcached; /usr/bin/memcached -p ${PORT} -u ${USER} -m ${CACHESIZE} -c ${MAXCONN} $OPTIONS memcached 9 8 0.000 2h2m27.450781466s ? 0s /usr/bin/memcached -p 11211 -u memcached -m 11970 -c 8192 -v -l 192.168.24.2 -U 0 -X -t 8 >> /var/log/memcached.log 2>&1 Workaround: disable caching in nova crudini --set /var/lib/config-data/puppet-generated/nova/etc/nova/nova.conf cache enabled false podman restart nova_conductor Versions inside the f28 nova-conductor container: python-oslo-versionedobjects-lang-1.34.1-0.20181128123056.50474ad.fc28.noarch python3-oslo-config-6.7.0-0.20181108120643.64e020a.fc28.noarch python3-oslo-cache-1.32.0-0.20190108083242.eb68d73.fc28.noarch python3-oslo-versionedobjects-1.34.1-0.20181128123056.50474ad.fc28.noarch python-oslo-middleware-lang-3.37.0-0.20181211135004.a609e68.fc28.noarch puppet-oslo-14.2.0-0.20190111032249.b937844.fc28.noarch python3-oslo-middleware-3.37.0-0.20181211135004.a609e68.fc28.noarch python3-oslo-service-1.34.0-0.20190114140259.d987a4a.fc28.noarch python3-oslo-policy-1.44.0-0.20190108082943.c9ea8f7.fc28.noarch python-oslo-policy-lang-1.44.0-0.20190108082943.c9ea8f7.fc28.noarch python-oslo-concurrency-lang-3.29.0-0.20181128184857.0767ddf.fc28.noarch python3-oslo-utils-3.39.0-0.20190110184625.3823707.fc28.noarch python3-oslo-vmware-2.32.1-0.20181126101324.04f82ae.fc28.noarch python-oslo-log-lang-3.42.2-0.20190114115634.1babd44.fc28.noarch python3-oslo-rootwrap-5.15.1-0.20181226111925.27f2314.fc28.noarch python3-oslo-context-2.22.0-0.20190114134914.f65408d.fc28.noarch python3-oslo-privsep-1.30.1-0.20181127103633.9391cbf.fc28.noarch python-oslo-utils-lang-3.39.0-0.20190110184625.3823707.fc28.noarch python-oslo-db-lang-4.43.0-0.20190108081838.a6d2cc5.fc28.noarch python3-memcached-1.58-5.fc29.noarch python-oslo-cache-lang-1.32.0-0.20190108083242.eb68d73.fc28.noarch python3-oslo-i18n-3.23.0-0.20190111201133.a5fde9a.fc28.noarch python3-oslo-concurrency-3.29.0-0.20181128184857.0767ddf.fc28.noarch 
python3-oslo-db-4.43.0-0.20190108081838.a6d2cc5.fc28.noarch python3-oslo-reports-1.29.1-0.20181126102912.dde49a4.fc28.noarch python-oslo-i18n-lang-3.23.0-0.20190111201133.a5fde9a.fc28.noarch python3-oslo-messaging-9.3.0-0.20190114135848.13fa4f5.fc28.noarch python3-oslo-serialization-2.28.1-0.20181001122254.0371c1d.fc28.noarch python-oslo-vmware-lang-2.32.1-0.20181126101324.04f82ae.fc28.noarch python-oslo-privsep-lang-1.30.1-0.20181127103633.9391cbf.fc28.noarch python3-oslo-log-3.42.2-0.20190114115634.1babd44.fc28.noarch puppet-memcached-3.3.0-0.20180803162752.e517b44.fc28.noarch nova conductor running on a rhel8 host inside f28 based containers hits the following error: 2019-01-17 13:59:37.049 46 DEBUG oslo_concurrency.lockutils [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] Lock "b051d003-482d-4cb7-810c-8d256e6c879e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py:327 2019-01-17 13:59:37.050 46 DEBUG oslo_concurrency.lockutils [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] Lock "b051d003-482d-4cb7-810c-8d256e6c879e" released by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py:339 2019-01-17 13:59:37.060 46 DEBUG oslo_db.sqlalchemy.engines [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3.6/site-packages/oslo_db/sqlalchemy/engines.py:307 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] Exception during message handling: TypeError: object() takes no parameters 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server Traceback (most recent call last): 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 163, in _get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = self.queue.pop().connection 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server IndexError: pop from an empty deque 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server Traceback (most recent call last): 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/server.py", line 166, in _process_incoming 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 265, in dispatch 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) 2019-01-17 13:59:37.096 46 ERROR 
oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 194, in _do_dispatch 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/conductor/manager.py", line 1303, in schedule_and_build_instances 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server instance.create() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return fn(self, *args, **kwargs) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/instance.py", line 607, in create 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server self._load_ec2_ids() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/instance.py", line 983, in _load_ec2_ids 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server self.ec2_ids = objects.EC2Ids.get_by_instance(self._context, self) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_versionedobjects/base.py", line 184, in wrapper 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server result = fn(cls, context, *args, **kwargs) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/ec2.py", line 216, in get_by_instance 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server ec2_ids = cls._get_ec2_ids(context, instance) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/ec2.py", line 200, in _get_ec2_ids 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server ec2_ids['instance_id'] = ec2utils.id_to_ec2_inst_id(instance.uuid) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/api/ec2/ec2utils.py", line 188, in id_to_ec2_inst_id 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server int_id = get_int_id_from_instance_uuid(ctxt, instance_id) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/api/ec2/ec2utils.py", line 47, in memoizer 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server value = _CACHE.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/cache_utils.py", line 107, in get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server value = self.region.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/dogpile/cache/region.py", line 645, in get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server value = self.backend.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/dogpile/cache/backends/memcached.py", line 161, in get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server value = self.client.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/backends/memcache_pool.py", line 31, in _run_method 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server with self.client_pool.acquire() as client: 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib64/python3.6/contextlib.py", 
line 81, in __enter__ 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return next(self.gen) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 128, in acquire 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = self.get(timeout=self._connection_get_timeout) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/eventlet/queue.py", line 303, in get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return self._get() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 214, in _get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = ConnectionPool._get(self) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 165, in _get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = self._create_connection() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 206, in _create_connection 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return _MemcacheClient(self.urls, **self._arguments) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server TypeError: object() takes no parameters 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server nova.conf has: ()[nova@standalone /]$ grep -v ^# /etc/nova/nova.conf |grep '[a-zA-Z]' |grep -i mem backend=oslo_cache.memcache_pool memcache_servers=192.168.24.2:11211 memcached_servers=192.168.24.2:11211 memcache seems to be up: [root@standalone ~]# podman top memcached USER PID PPID %CPU ELAPSED TTY TIME COMMAND memcached 1 0 0.000 2h2m27.45050385s ? 0s dumb-init --single-child -- /bin/bash -c source /etc/sysconfig/memcached; /usr/bin/memcached -p ${PORT} -u ${USER} -m ${CACHESIZE} -c ${MAXCONN} $OPTIONS memcached 8 1 0.000 2h2m27.450722828s ? 0s /bin/bash -c source /etc/sysconfig/memcached; /usr/bin/memcached -p ${PORT} -u ${USER} -m ${CACHESIZE} -c ${MAXCONN} $OPTIONS memcached 9 8 0.000 2h2m27.450781466s ? 
0s /usr/bin/memcached -p 11211 -u memcached -m 11970 -c 8192 -v -l 192.168.24.2 -U 0 -X -t 8 >> /var/log/memcached.log 2>&1 Workaround: disable caching in nova crudini --set /var/lib/config-data/puppet-generated/nova/etc/nova/nova.conf cache enabled false podman restart nova_conductor Versions inside the f28 nova-conductor container: python-oslo-versionedobjects-lang-1.34.1-0.20181128123056.50474ad.fc28.noarch python3-oslo-config-6.7.0-0.20181108120643.64e020a.fc28.noarch python3-oslo-cache-1.32.0-0.20190108083242.eb68d73.fc28.noarch python3-oslo-versionedobjects-1.34.1-0.20181128123056.50474ad.fc28.noarch python-oslo-middleware-lang-3.37.0-0.20181211135004.a609e68.fc28.noarch puppet-oslo-14.2.0-0.20190111032249.b937844.fc28.noarch python3-oslo-middleware-3.37.0-0.20181211135004.a609e68.fc28.noarch python3-oslo-service-1.34.0-0.20190114140259.d987a4a.fc28.noarch python3-oslo-policy-1.44.0-0.20190108082943.c9ea8f7.fc28.noarch python-oslo-policy-lang-1.44.0-0.20190108082943.c9ea8f7.fc28.noarch python-oslo-concurrency-lang-3.29.0-0.20181128184857.0767ddf.fc28.noarch python3-oslo-utils-3.39.0-0.20190110184625.3823707.fc28.noarch python3-oslo-vmware-2.32.1-0.20181126101324.04f82ae.fc28.noarch python-oslo-log-lang-3.42.2-0.20190114115634.1babd44.fc28.noarch python3-oslo-rootwrap-5.15.1-0.20181226111925.27f2314.fc28.noarch python3-oslo-context-2.22.0-0.20190114134914.f65408d.fc28.noarch python3-oslo-privsep-1.30.1-0.20181127103633.9391cbf.fc28.noarch python-oslo-utils-lang-3.39.0-0.20190110184625.3823707.fc28.noarch python-oslo-db-lang-4.43.0-0.20190108081838.a6d2cc5.fc28.noarch python3-memcached-1.58-5.fc29.noarch python-oslo-cache-lang-1.32.0-0.20190108083242.eb68d73.fc28.noarch python3-oslo-i18n-3.23.0-0.20190111201133.a5fde9a.fc28.noarch python3-oslo-concurrency-3.29.0-0.20181128184857.0767ddf.fc28.noarch python3-oslo-db-4.43.0-0.20190108081838.a6d2cc5.fc28.noarch python3-oslo-reports-1.29.1-0.20181126102912.dde49a4.fc28.noarch python-oslo-i18n-lang-3.23.0-0.20190111201133.a5fde9a.fc28.noarch python3-oslo-messaging-9.3.0-0.20190114135848.13fa4f5.fc28.noarch python3-oslo-serialization-2.28.1-0.20181001122254.0371c1d.fc28.noarch python-oslo-vmware-lang-2.32.1-0.20181126101324.04f82ae.fc28.noarch python-oslo-privsep-lang-1.30.1-0.20181127103633.9391cbf.fc28.noarch python3-oslo-log-3.42.2-0.20190114115634.1babd44.fc28.noarch puppet-memcached-3.3.0-0.20180803162752.e517b44.fc28.noarch ------------------------------------------------ Ubuntu SRU details ------------------ [Impact] See description above. [Test Case] [Regression Potential] The regression potential is low. This is a minimal fix that has successfully been reviewed upstream and passed all upstream gate tests. It has already landed in upstream master branch and Ubuntu Disco and received 3 +1's and zuul tests +1 on stable/rocky gerrit reviews.
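The "TypeError: object() takes no parameters" raised from _MemcacheClient(self.urls, **self._arguments) is what Python 3 reports when object.__new__ ends up receiving the memcache client's constructor arguments. oslo.cache rebinds __new__ = object.__new__ on its _MemcacheClient subclass to undo the threading.local behaviour inherited via memcache.Client; that trick tolerates extra constructor arguments only as long as no base class overrides __new__ in Python, which appears to stop being true here once eventlet monkey-patches threading and replaces the C-implemented threading.local. A minimal standalone illustration follows (this is not the oslo.cache source; all names below are invented for the example):

    # Python 3.6 reports "TypeError: object() takes no parameters" here;
    # newer releases word it "object.__new__() takes exactly one argument".

    class GreenLocal:
        """Stand-in for a pure-Python greenlet/thread-local base that defines
        its own __new__ (as eventlet's replacement for threading.local does)."""
        def __new__(cls, *args, **kwargs):
            return super().__new__(cls)

    class FakeMemcacheClient(GreenLocal):
        """Stand-in for memcache.Client: its constructor takes arguments."""
        def __init__(self, servers, **kwargs):
            self.servers = servers

    class PooledClient(FakeMemcacheClient):
        # Same trick as oslo.cache's _MemcacheClient: force plain-object
        # allocation semantics by rebinding __new__.
        __new__ = object.__new__

    # With GreenLocal's __new__ in the MRO, the constructor arguments reach
    # object.__new__ directly and the call raises the TypeError seen in the
    # log; remove GreenLocal's __new__ and the same call succeeds.
    try:
        PooledClient(["192.168.24.2:11211"], dead_retry=5)
    except TypeError as exc:
        print(exc)  # object() takes no parameters (Python 3.6)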
2019-04-09 15:03:28 Jason Hobbs description nova conductor running on a rhel8 host inside f28 based containers hits the following error: 2019-01-17 13:59:37.049 46 DEBUG oslo_concurrency.lockutils [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] Lock "b051d003-482d-4cb7-810c-8d256e6c879e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py:327 2019-01-17 13:59:37.050 46 DEBUG oslo_concurrency.lockutils [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] Lock "b051d003-482d-4cb7-810c-8d256e6c879e" released by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py:339 2019-01-17 13:59:37.060 46 DEBUG oslo_db.sqlalchemy.engines [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3.6/site-packages/oslo_db/sqlalchemy/engines.py:307 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] Exception during message handling: TypeError: object() takes no parameters 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server Traceback (most recent call last): 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 163, in _get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = self.queue.pop().connection 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server IndexError: pop from an empty deque 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server Traceback (most recent call last): 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/server.py", line 166, in _process_incoming 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 265, in dispatch 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 194, in _do_dispatch 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/conductor/manager.py", line 1303, in schedule_and_build_instances 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server instance.create() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File 
"/usr/lib/python3.6/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return fn(self, *args, **kwargs) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/instance.py", line 607, in create 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server self._load_ec2_ids() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/instance.py", line 983, in _load_ec2_ids 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server self.ec2_ids = objects.EC2Ids.get_by_instance(self._context, self) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_versionedobjects/base.py", line 184, in wrapper 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server result = fn(cls, context, *args, **kwargs) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/ec2.py", line 216, in get_by_instance 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server ec2_ids = cls._get_ec2_ids(context, instance) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/ec2.py", line 200, in _get_ec2_ids 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server ec2_ids['instance_id'] = ec2utils.id_to_ec2_inst_id(instance.uuid) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/api/ec2/ec2utils.py", line 188, in id_to_ec2_inst_id 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server int_id = get_int_id_from_instance_uuid(ctxt, instance_id) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/api/ec2/ec2utils.py", line 47, in memoizer 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server value = _CACHE.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/cache_utils.py", line 107, in get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server value = self.region.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/dogpile/cache/region.py", line 645, in get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server value = self.backend.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/dogpile/cache/backends/memcached.py", line 161, in get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server value = self.client.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/backends/memcache_pool.py", line 31, in _run_method 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server with self.client_pool.acquire() as client: 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib64/python3.6/contextlib.py", line 81, in __enter__ 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return next(self.gen) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 128, in acquire 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = self.get(timeout=self._connection_get_timeout) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/eventlet/queue.py", line 303, in get 2019-01-17 13:59:37.096 46 
ERROR oslo_messaging.rpc.server return self._get() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 214, in _get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = ConnectionPool._get(self) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 165, in _get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = self._create_connection() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 206, in _create_connection 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return _MemcacheClient(self.urls, **self._arguments) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server TypeError: object() takes no parameters 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server nova.conf has: ()[nova@standalone /]$ grep -v ^# /etc/nova/nova.conf |grep '[a-zA-Z]' |grep -i mem backend=oslo_cache.memcache_pool memcache_servers=192.168.24.2:11211 memcached_servers=192.168.24.2:11211 memcache seems to be up: [root@standalone ~]# podman top memcached USER PID PPID %CPU ELAPSED TTY TIME COMMAND memcached 1 0 0.000 2h2m27.45050385s ? 0s dumb-init --single-child -- /bin/bash -c source /etc/sysconfig/memcached; /usr/bin/memcached -p ${PORT} -u ${USER} -m ${CACHESIZE} -c ${MAXCONN} $OPTIONS memcached 8 1 0.000 2h2m27.450722828s ? 0s /bin/bash -c source /etc/sysconfig/memcached; /usr/bin/memcached -p ${PORT} -u ${USER} -m ${CACHESIZE} -c ${MAXCONN} $OPTIONS memcached 9 8 0.000 2h2m27.450781466s ? 0s /usr/bin/memcached -p 11211 -u memcached -m 11970 -c 8192 -v -l 192.168.24.2 -U 0 -X -t 8 >> /var/log/memcached.log 2>&1 Workaround: disable caching in nova crudini --set /var/lib/config-data/puppet-generated/nova/etc/nova/nova.conf cache enabled false podman restart nova_conductor Versions inside the f28 nova-conductor container: python-oslo-versionedobjects-lang-1.34.1-0.20181128123056.50474ad.fc28.noarch python3-oslo-config-6.7.0-0.20181108120643.64e020a.fc28.noarch python3-oslo-cache-1.32.0-0.20190108083242.eb68d73.fc28.noarch python3-oslo-versionedobjects-1.34.1-0.20181128123056.50474ad.fc28.noarch python-oslo-middleware-lang-3.37.0-0.20181211135004.a609e68.fc28.noarch puppet-oslo-14.2.0-0.20190111032249.b937844.fc28.noarch python3-oslo-middleware-3.37.0-0.20181211135004.a609e68.fc28.noarch python3-oslo-service-1.34.0-0.20190114140259.d987a4a.fc28.noarch python3-oslo-policy-1.44.0-0.20190108082943.c9ea8f7.fc28.noarch python-oslo-policy-lang-1.44.0-0.20190108082943.c9ea8f7.fc28.noarch python-oslo-concurrency-lang-3.29.0-0.20181128184857.0767ddf.fc28.noarch python3-oslo-utils-3.39.0-0.20190110184625.3823707.fc28.noarch python3-oslo-vmware-2.32.1-0.20181126101324.04f82ae.fc28.noarch python-oslo-log-lang-3.42.2-0.20190114115634.1babd44.fc28.noarch python3-oslo-rootwrap-5.15.1-0.20181226111925.27f2314.fc28.noarch python3-oslo-context-2.22.0-0.20190114134914.f65408d.fc28.noarch python3-oslo-privsep-1.30.1-0.20181127103633.9391cbf.fc28.noarch python-oslo-utils-lang-3.39.0-0.20190110184625.3823707.fc28.noarch python-oslo-db-lang-4.43.0-0.20190108081838.a6d2cc5.fc28.noarch python3-memcached-1.58-5.fc29.noarch python-oslo-cache-lang-1.32.0-0.20190108083242.eb68d73.fc28.noarch python3-oslo-i18n-3.23.0-0.20190111201133.a5fde9a.fc28.noarch python3-oslo-concurrency-3.29.0-0.20181128184857.0767ddf.fc28.noarch 
python3-oslo-db-4.43.0-0.20190108081838.a6d2cc5.fc28.noarch python3-oslo-reports-1.29.1-0.20181126102912.dde49a4.fc28.noarch python-oslo-i18n-lang-3.23.0-0.20190111201133.a5fde9a.fc28.noarch python3-oslo-messaging-9.3.0-0.20190114135848.13fa4f5.fc28.noarch python3-oslo-serialization-2.28.1-0.20181001122254.0371c1d.fc28.noarch python-oslo-vmware-lang-2.32.1-0.20181126101324.04f82ae.fc28.noarch python-oslo-privsep-lang-1.30.1-0.20181127103633.9391cbf.fc28.noarch python3-oslo-log-3.42.2-0.20190114115634.1babd44.fc28.noarch puppet-memcached-3.3.0-0.20180803162752.e517b44.fc28.noarch ------------------------------------------------ Ubuntu SRU details ------------------ [Impact] See description above. [Test Case] [Regression Potential] The regression potential is low. This is a minimal fix that has successfully been reviewed upstream and passed all upstream gate tests. It has already landed in upstream master branch and Ubuntu Disco and received 3 +1's and zuul tests +1 on stable/rocky gerrit reviews. nova conductor running on a rhel8 host inside f28 based containers hits the following error: 2019-01-17 13:59:37.049 46 DEBUG oslo_concurrency.lockutils [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] Lock "b051d003-482d-4cb7-810c-8d256e6c879e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py:327 2019-01-17 13:59:37.050 46 DEBUG oslo_concurrency.lockutils [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] Lock "b051d003-482d-4cb7-810c-8d256e6c879e" released by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py:339 2019-01-17 13:59:37.060 46 DEBUG oslo_db.sqlalchemy.engines [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3.6/site-packages/oslo_db/sqlalchemy/engines.py:307 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server [req-284f3071-8eee-4dcb-903c-838f2e024b48 40ca1490773f49f791d3a834af3702c8 8671bdf05abf48f58a9bdcdb0ef4b740 - default default] Exception during message handling: TypeError: object() takes no parameters 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server Traceback (most recent call last): 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 163, in _get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = self.queue.pop().connection 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server IndexError: pop from an empty deque 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server Traceback (most recent call last): 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/server.py", line 
166, in _process_incoming 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 265, in dispatch 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 194, in _do_dispatch 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/conductor/manager.py", line 1303, in schedule_and_build_instances 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server instance.create() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return fn(self, *args, **kwargs) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/instance.py", line 607, in create 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server self._load_ec2_ids() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/instance.py", line 983, in _load_ec2_ids 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server self.ec2_ids = objects.EC2Ids.get_by_instance(self._context, self) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_versionedobjects/base.py", line 184, in wrapper 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server result = fn(cls, context, *args, **kwargs) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/ec2.py", line 216, in get_by_instance 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server ec2_ids = cls._get_ec2_ids(context, instance) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/objects/ec2.py", line 200, in _get_ec2_ids 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server ec2_ids['instance_id'] = ec2utils.id_to_ec2_inst_id(instance.uuid) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/api/ec2/ec2utils.py", line 188, in id_to_ec2_inst_id 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server int_id = get_int_id_from_instance_uuid(ctxt, instance_id) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/api/ec2/ec2utils.py", line 47, in memoizer 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server value = _CACHE.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/cache_utils.py", line 107, in get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server value = self.region.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/dogpile/cache/region.py", line 645, in get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server value = self.backend.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/dogpile/cache/backends/memcached.py", line 161, in get 2019-01-17 13:59:37.096 46 
ERROR oslo_messaging.rpc.server value = self.client.get(key) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/backends/memcache_pool.py", line 31, in _run_method 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server with self.client_pool.acquire() as client: 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib64/python3.6/contextlib.py", line 81, in __enter__ 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return next(self.gen) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 128, in acquire 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = self.get(timeout=self._connection_get_timeout) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/eventlet/queue.py", line 303, in get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return self._get() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 214, in _get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = ConnectionPool._get(self) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 165, in _get 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server conn = self._create_connection() 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 206, in _create_connection 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server return _MemcacheClient(self.urls, **self._arguments) 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server TypeError: object() takes no parameters 2019-01-17 13:59:37.096 46 ERROR oslo_messaging.rpc.server nova.conf has: ()[nova@standalone /]$ grep -v ^# /etc/nova/nova.conf |grep '[a-zA-Z]' |grep -i mem backend=oslo_cache.memcache_pool memcache_servers=192.168.24.2:11211 memcached_servers=192.168.24.2:11211 memcache seems to be up: [root@standalone ~]# podman top memcached USER PID PPID %CPU ELAPSED TTY TIME COMMAND memcached 1 0 0.000 2h2m27.45050385s ? 0s dumb-init --single-child -- /bin/bash -c source /etc/sysconfig/memcached; /usr/bin/memcached -p ${PORT} -u ${USER} -m ${CACHESIZE} -c ${MAXCONN} $OPTIONS memcached 8 1 0.000 2h2m27.450722828s ? 0s /bin/bash -c source /etc/sysconfig/memcached; /usr/bin/memcached -p ${PORT} -u ${USER} -m ${CACHESIZE} -c ${MAXCONN} $OPTIONS memcached 9 8 0.000 2h2m27.450781466s ? 
0s /usr/bin/memcached -p 11211 -u memcached -m 11970 -c 8192 -v -l 192.168.24.2 -U 0 -X -t 8 >> /var/log/memcached.log 2>&1 Workaround: disable caching in nova crudini --set /var/lib/config-data/puppet-generated/nova/etc/nova/nova.conf cache enabled false podman restart nova_conductor Versions inside the f28 nova-conductor container: python-oslo-versionedobjects-lang-1.34.1-0.20181128123056.50474ad.fc28.noarch python3-oslo-config-6.7.0-0.20181108120643.64e020a.fc28.noarch python3-oslo-cache-1.32.0-0.20190108083242.eb68d73.fc28.noarch python3-oslo-versionedobjects-1.34.1-0.20181128123056.50474ad.fc28.noarch python-oslo-middleware-lang-3.37.0-0.20181211135004.a609e68.fc28.noarch puppet-oslo-14.2.0-0.20190111032249.b937844.fc28.noarch python3-oslo-middleware-3.37.0-0.20181211135004.a609e68.fc28.noarch python3-oslo-service-1.34.0-0.20190114140259.d987a4a.fc28.noarch python3-oslo-policy-1.44.0-0.20190108082943.c9ea8f7.fc28.noarch python-oslo-policy-lang-1.44.0-0.20190108082943.c9ea8f7.fc28.noarch python-oslo-concurrency-lang-3.29.0-0.20181128184857.0767ddf.fc28.noarch python3-oslo-utils-3.39.0-0.20190110184625.3823707.fc28.noarch python3-oslo-vmware-2.32.1-0.20181126101324.04f82ae.fc28.noarch python-oslo-log-lang-3.42.2-0.20190114115634.1babd44.fc28.noarch python3-oslo-rootwrap-5.15.1-0.20181226111925.27f2314.fc28.noarch python3-oslo-context-2.22.0-0.20190114134914.f65408d.fc28.noarch python3-oslo-privsep-1.30.1-0.20181127103633.9391cbf.fc28.noarch python-oslo-utils-lang-3.39.0-0.20190110184625.3823707.fc28.noarch python-oslo-db-lang-4.43.0-0.20190108081838.a6d2cc5.fc28.noarch python3-memcached-1.58-5.fc29.noarch python-oslo-cache-lang-1.32.0-0.20190108083242.eb68d73.fc28.noarch python3-oslo-i18n-3.23.0-0.20190111201133.a5fde9a.fc28.noarch python3-oslo-concurrency-3.29.0-0.20181128184857.0767ddf.fc28.noarch python3-oslo-db-4.43.0-0.20190108081838.a6d2cc5.fc28.noarch python3-oslo-reports-1.29.1-0.20181126102912.dde49a4.fc28.noarch python-oslo-i18n-lang-3.23.0-0.20190111201133.a5fde9a.fc28.noarch python3-oslo-messaging-9.3.0-0.20190114135848.13fa4f5.fc28.noarch python3-oslo-serialization-2.28.1-0.20181001122254.0371c1d.fc28.noarch python-oslo-vmware-lang-2.32.1-0.20181126101324.04f82ae.fc28.noarch python-oslo-privsep-lang-1.30.1-0.20181127103633.9391cbf.fc28.noarch python3-oslo-log-3.42.2-0.20190114115634.1babd44.fc28.noarch puppet-memcached-3.3.0-0.20180803162752.e517b44.fc28.noarch ------------------------------------------------ Ubuntu SRU details ------------------ [Impact] See description above. [Test Case] 1. Deploy an HA Rocky cloud using the openstack-next-charms, with memcache related to nova-cloud-controller. 2. Configure a simple network, upload a bionic image. 3. Try to start an instance via the API. It will stay in build. 4. Try to use the openstack cli to read availability zones. It will return an error. For more info see bug 1823740 [Regression Potential] The regression potential is low. This is a minimal fix that has successfully been reviewed upstream and passed all upstream gate tests. It has already landed in upstream master branch and Ubuntu Disco and received 3 +1's and zuul tests +1 on stable/rocky gerrit reviews.
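The two stacked tracebacks in the description (first "IndexError: pop from an empty deque", then "During handling of the above exception, another exception occurred" ending in the TypeError) reflect the pool's get path: it first tries to pop an idle connection and, when the deque is empty, falls back to creating a new client inside the exception handler, so the creation failure is reported chained to the harmless IndexError. A simplified sketch of that control flow (not the actual oslo_cache._memcache_pool code):

    import collections

    class PoolSketch:
        """Illustrative connection pool: reuse an idle connection or build one."""

        def __init__(self, create_connection):
            self.queue = collections.deque()        # idle connections
            self._create_connection = create_connection

        def _get(self):
            try:
                # Reuse an idle connection if one is queued.
                return self.queue.pop()
            except IndexError:
                # Empty pool: build a fresh client. An error raised here (such
                # as the TypeError above) is chained to the IndexError, which
                # is why the log shows both even though only the second one
                # is the real failure.
                return self._create_connection()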
2019-04-09 22:28:39 Brian Murray python-oslo.cache (Ubuntu Cosmic): status Triaged Fix Committed
2019-04-09 22:28:42 Brian Murray bug added subscriber SRU Verification
2019-04-09 22:28:46 Brian Murray tags cdo-qa cdo-release-blocker foundations-engine cdo-qa cdo-release-blocker foundations-engine verification-needed verification-needed-cosmic
2019-04-10 12:44:44 Corey Bryant cloud-archive/rocky: status Triaged Fix Committed
2019-04-10 12:44:46 Corey Bryant tags cdo-qa cdo-release-blocker foundations-engine verification-needed verification-needed-cosmic cdo-qa cdo-release-blocker foundations-engine verification-needed verification-needed-cosmic verification-rocky-needed
2019-04-16 14:39:20 Jason Hobbs tags cdo-qa cdo-release-blocker foundations-engine verification-needed verification-needed-cosmic verification-rocky-needed cdo-qa cdo-release-blocker foundations-engine verification-needed verification-needed-cosmic verification-rocky-done
2019-04-17 14:24:03 James Page cloud-archive/rocky: status Fix Committed Fix Released
2019-04-24 14:14:57 OpenStack Infra tags cdo-qa cdo-release-blocker foundations-engine verification-needed verification-needed-cosmic verification-rocky-done cdo-qa cdo-release-blocker foundations-engine in-stable-rocky verification-needed verification-needed-cosmic verification-rocky-done
2019-05-10 15:18:32 OpenStack Infra tags cdo-qa cdo-release-blocker foundations-engine in-stable-rocky verification-needed verification-needed-cosmic verification-rocky-done cdo-qa cdo-release-blocker foundations-engine in-stable-queens in-stable-rocky verification-needed verification-needed-cosmic verification-rocky-done
2020-02-26 14:35:30 OpenStack Infra oslo.cache: status In Progress Fix Released