Create instance from volume fails on reschedule

Bug #1936474 reported by Rajat Dhasmana
Affects: Cinder
Status: In Progress
Importance: High
Assigned to: Rajat Dhasmana

Bug Description

When we create an instance from a volume and the instance build fails on the first attempt, Nova reschedules it using the same volume.
Because the image metadata is not properly cleaned up on the Cinder side, the reschedule fails with the following error:
cinder.exception.GlanceMetadataExists: Glance metadata cannot be updated, key signature_verified exists for volume id 7ecdc5b4-304f-42ac-914e-3671f34f97a2

Trace:

Jul 16 07:37:42 roraut-devstack cinder-volume[7670]: ERROR oslo_messaging.rpc.server [req-8e09ff27-d56b-40b8-bfc9-af6a35e16603 req-57c1d617-209a-438c-b8a7-7f57abf0ccaa admin None] Exception during message handling: cinder.exception.GlanceMetadataExists: Glance metadata cannot be updated, key signature_verified exists for volume id 7ecdc5b4-304f-42ac-914e-3671f34f97a2
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python3.8/dist-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python3.8/dist-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python3.8/dist-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "<decorator-gen-751>", line 2, in create_volume
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/objects/cleanable.py", line 208, in wrapper
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs)
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/volume/manager.py", line 771, in create_volume
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server _run_flow()
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/volume/manager.py", line 763, in _run_flow
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server flow_engine.run()
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python3.8/dist-packages/taskflow/engines/action_engine/engine.py", line 247, in run
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server for _state in self.run_iter(timeout=timeout):
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python3.8/dist-packages/taskflow/engines/action_engine/engine.py", line 340, in run_iter
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server failure.Failure.reraise_if_any(er_failures)
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python3.8/dist-packages/taskflow/types/failure.py", line 339, in reraise_if_any
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server failures[0].reraise()
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python3.8/dist-packages/taskflow/types/failure.py", line 346, in reraise
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server six.reraise(*self._exc_info)
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python3.8/dist-packages/six.py", line 719, in reraise
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server raise value
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python3.8/dist-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server result = task.execute(**arguments)
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/volume/flows/manager/create_volume.py", line 1132, in execute
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server model_update = self._create_from_image(context,
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/utils.py", line 616, in _wrapper
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server return r.call(f, *args, **kwargs)
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python3.8/dist-packages/tenacity/__init__.py", line 411, in call
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server return self.__call__(*args, **kwargs)
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python3.8/dist-packages/tenacity/__init__.py", line 423, in __call__
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server do = self.iter(retry_state=retry_state)
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python3.8/dist-packages/tenacity/__init__.py", line 360, in iter
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server return fut.result()
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/usr/lib/python3.8/concurrent/futures/_base.py", line 437, in result
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server return self.__get_result()
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/usr/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server raise self._exception
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/usr/local/lib/python3.8/dist-packages/tenacity/__init__.py", line 426, in __call__
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server result = fn(*args, **kwargs)
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/volume/flows/manager/create_volume.py", line 1018, in _create_from_image
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server model_update, cloned = self._prepare_image_cache_entry(
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "<decorator-gen-750>", line 2, in _prepare_image_cache_entry
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/coordination.py", line 151, in _synchronized
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server return f(*a, **k)
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/volume/flows/manager/create_volume.py", line 815, in _prepare_image_cache_entry
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server model_update = self._create_from_image_cache_or_download(
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/volume/flows/manager/create_volume.py", line 905, in _create_from_image_cache_or_download
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server self.db.volume_glance_metadata_bulk_create(
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/db/api.py", line 988, in volume_glance_metadata_bulk_create
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server return IMPL.volume_glance_metadata_bulk_create(context, volume_id,
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/db/sqlalchemy/api.py", line 182, in wrapper
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs)
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/db/sqlalchemy/api.py", line 197, in wrapper
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server return f(context, volume_id, *args, **kwargs)
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/db/sqlalchemy/api.py", line 5128, in volume_glance_metadata_bulk_create
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server raise exception.GlanceMetadataExists(key=key,
Jul 16 07:37:42 cinder-volume[7670]: ERROR oslo_messaging.rpc.server cinder.exception.GlanceMetadataExists: Glance metadata cannot be updated, key signature_verified exists for volume id 7ecdc5b4-304f-42ac-914e-3671f34f97a2
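The failure mode can be illustrated with a minimal sketch (not the actual Cinder DB layer; the helper and store here are hypothetical stand-ins): the bulk-create path refuses to overwrite an existing key, so a `signature_verified` entry left over from the failed first attempt aborts the rescheduled attempt.

```python
class GlanceMetadataExists(Exception):
    """Raised when a metadata key already exists for a volume."""


def glance_metadata_bulk_create(store, volume_id, metadata):
    """Insert image metadata for a volume, refusing to overwrite keys.

    Toy model of the behaviour seen in the trace: if any key survives
    from a failed first attempt, the whole bulk insert aborts.
    """
    existing = store.setdefault(volume_id, {})
    for key, value in metadata.items():
        if key in existing:
            raise GlanceMetadataExists(
                f"key {key} exists for volume id {volume_id}")
        existing[key] = value


store = {}
vol = "7ecdc5b4-304f-42ac-914e-3671f34f97a2"
# First boot attempt copies the image metadata successfully...
glance_metadata_bulk_create(store, vol, {"signature_verified": "False"})
# ...the instance build fails, the entry is never cleaned up, and the
# rescheduled attempt then hits the duplicate key.
try:
    glance_metadata_bulk_create(store, vol, {"signature_verified": "False"})
except GlanceMetadataExists as exc:
    print(exc)
```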

Changed in cinder:
status: New → In Progress
Revision history for this message
Sofia Enriquez (lsofia-enriquez) wrote :

Greetings Rajat, are you facing this issue with RBD for both glance and cinder or are you using another backend? Thanks in advance, Sofia

tags: added: glance image volume
Changed in cinder:
importance: Undecided → High
status: In Progress → Incomplete
Revision history for this message
Rajat Dhasmana (whoami-rajat) wrote :

Somehow I forgot that this linking is broken.

Proposed patch: https://review.opendev.org/c/openstack/cinder/+/800768

Changed in cinder:
assignee: nobody → Rajat Dhasmana (whoami-rajat)
Revision history for this message
Rajat Dhasmana (whoami-rajat) wrote :

Hi Sofia,

This is not backend related. Generally, when we create a bootable volume from an image and an error occurs, we don't clean up the image metadata entry in the DB. When creating an instance from a bootable volume, after the first failure Nova reschedules the instance creation, and we see the above trace during the second attempt due to a leftover Glance image metadata entry in the DB.
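For what it's worth, the cleanup idea can be sketched in miniature (hypothetical helper and in-memory store, not the actual patch in the linked review): clearing any stale keys before inserting makes the metadata copy idempotent across Nova reschedules.

```python
def bulk_create_with_cleanup(store, volume_id, metadata):
    """Idempotent metadata copy: drop residue from a failed earlier attempt.

    Hypothetical sketch only. Removing any stale keys before inserting
    means a rescheduled attempt no longer trips over entries left behind
    by the first, failed attempt.
    """
    existing = store.setdefault(volume_id, {})
    for key in metadata:
        existing.pop(key, None)  # remove the leftover entry, if any
    existing.update(metadata)


# Residue from the failed first attempt is already present...
store = {"7ecdc5b4-304f-42ac-914e-3671f34f97a2":
         {"signature_verified": "False"}}
# ...but the rescheduled attempt now succeeds instead of raising.
bulk_create_with_cleanup(
    store, "7ecdc5b4-304f-42ac-914e-3671f34f97a2",
    {"signature_verified": "False"})
```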

Changed in cinder:
status: Incomplete → In Progress