Attempting to start or hard reboot a user's instance as an admin with encrypted volumes leaves the instance unbootable when [workarounds]disable_native_luksv1 is enabled

Bug #1917619 reported by Lee Yarwood
Affects                   Status        Importance  Assigned to   Milestone
OpenStack Compute (nova)  Fix Released  Medium      Lee Yarwood
Wallaby                   Fix Released  Undecided   Unassigned

Bug Description

Description
===========
As described in the summary: by default, admins do not have access to user-created Barbican secrets. As a result, admins cannot hard reboot or stop/start such instances, because these operations delete the local libvirt secrets, refetch the secrets from Barbican, and recreate the local secrets.

However, this initial attempt by an admin destroys the local secrets *before* failing to access anything in Barbican.

As a result, any subsequent request by the owner of the instance to hard reboot or stop/start it can fail: the _detach_encryptor logic finds no local secret and assumes that native LUKSv1 encryption isn't being used. This causes the os-brick encryptors to be loaded, which fails if the underlying volume type isn't supported, such as rbd.
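The failing decision can be modeled with a short sketch. This is a hypothetical, heavily simplified model of the pre-fix control flow; the names echo Nova's _detach_encryptor but it is not the actual implementation:

```python
class VolumeEncryptionNotSupported(Exception):
    """Stand-in for os_brick.exception.VolumeEncryptionNotSupported."""


def detach_encryptor(connection_info, allow_native_luksv1):
    """Simplified pre-fix check: skip os-brick only when native LUKSv1
    decryption is still allowed AND no local device path exists
    (i.e. libvirt/QEMU decrypted the volume natively)."""
    device_path = connection_info["data"].get("device_path")
    if allow_native_luksv1 and not device_path:
        return "skipped"  # nothing for os-brick to detach
    if connection_info["driver_volume_type"] == "rbd":
        # os-brick has no encryptor support for network volume types
        # such as rbd, so loading the encryptor fails.
        raise VolumeEncryptionNotSupported(
            "Volume encryption is not supported for rbd volume")
    return "os-brick detach"


ci = {"driver_volume_type": "rbd", "data": {}}
# Workaround disabled: a native attach is assumed, os-brick is skipped.
print(detach_encryptor(ci, allow_native_luksv1=True))   # skipped
# Workaround enabled: _allow_native_luksv1 returns False, so the
# os-brick encryptors are loaded and fail for rbd.
try:
    detach_encryptor(ci, allow_native_luksv1=False)
except VolumeEncryptionNotSupported as exc:
    print(type(exc).__name__)  # VolumeEncryptionNotSupported
```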

Steps to reproduce
==================
1. As a non-admin user, create an instance with encrypted rbd volumes attached
2. Attempt to hard reboot or stop/start the instance as an admin
3. Attempt to hard reboot or stop/start the instance as the owner
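Reproducing this assumes the workaround named in the summary has already been enabled on the compute node, e.g. in nova.conf (illustrative fragment):

```ini
[workarounds]
# Disable QEMU/libvirt native LUKSv1 decryption, forcing the
# os-brick encryptors to be used instead.
disable_native_luksv1 = True
```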

Expected result
===============
The request by the admin to hard reboot or stop/start the instance fails.
The request by the owner to hard reboot or stop/start the instance succeeds.

Actual result
=============
The request by the admin to hard reboot or stop/start the instance fails.
The request by the owner to hard reboot or stop/start the instance fails due to os_brick.exception.VolumeEncryptionNotSupported being raised, leaving the instance unbootable.

Environment
===========
1. Exact version of OpenStack you are running. See the following
  list for all releases: http://docs.openstack.org/releases/

   master

2. Which hypervisor did you use?
   (For example: Libvirt + KVM, Libvirt + XEN, Hyper-V, PowerKVM, ...)
   What's the version of that?

   libvirt

3. Which storage type did you use?
   (For example: Ceph, LVM, GPFS, ...)
   What's the version of that?

   N/A

4. Which networking type did you use?
   (For example: nova-network, Neutron with OpenVSwitch, ...)

   N/A

Logs & Configs
==============

https://bugzilla.redhat.com/show_bug.cgi?id=1934513

2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server [req-fe304872-e35f-4cb3-8760-4fd1eed745bc fef8c04ca63ab77e9a37b9d79367fd49747d2016352759f6faa8475fbf6f63c1 4127275f099844f28fde120064aa4753 - 1d485afd913b4c489730f79d83044080 1d485afd913b4c489730f79d83044080] Exception during message handling: os_brick.exception.VolumeEncryptionNotSupported: Volume encryption is not supported for rbd volume d9817c6a-9c84-472a-8fc8-58ad73b389aa.
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 274, in dispatch
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 194, in _do_dispatch
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/exception_wrapper.py", line 79, in wrapped
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server function_name, call_dict, binary, tb)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server self.force_reraise()
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server raise value
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/exception_wrapper.py", line 69, in wrapped
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 191, in decorated_function
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server "Error: %s", e, instance=instance)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server self.force_reraise()
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server raise value
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 161, in decorated_function
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/compute/utils.py", line 1372, in decorated_function
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 219, in decorated_function
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server kwargs['instance'], e, sys.exc_info())
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server self.force_reraise()
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server raise value
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 207, in decorated_function
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 3140, in start_instance
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server self._power_on(context, instance)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 3110, in _power_on
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server block_device_info)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 3459, in power_on
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server self._hard_reboot(context, instance, network_info, block_device_info)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 3306, in _hard_reboot
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server block_device_info=block_device_info)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1316, in destroy
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server destroy_disks)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1389, in cleanup
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server cleanup_instance_disks=cleanup_instance_disks)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1474, in _cleanup
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server instance=instance)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server self.force_reraise()
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server raise value
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1461, in _cleanup
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server self._disconnect_volume(context, connection_info, instance)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1687, in _disconnect_volume
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server self._detach_encryptor(context, connection_info, encryption=encryption)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1813, in _detach_encryptor
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server encryption)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 1730, in _get_volume_encryptor
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server **encryption)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/os_brick/encryptors/__init__.py", line 93, in get_volume_encryptor
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server **kwargs)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_utils/importutils.py", line 44, in import_object
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server return import_class(import_str)(*args, **kwargs)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/os_brick/encryptors/luks.py", line 61, in __init__
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server *args, **kwargs)
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/os_brick/encryptors/cryptsetup.py", line 55, in __init__
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server volume_type=connection_info['driver_volume_type'])
2021-02-23 17:07:50.453 7 ERROR oslo_messaging.rpc.server os_brick.exception.VolumeEncryptionNotSupported: Volume encryption is not supported for rbd volume d9817c6a-9c84-472a-8fc8-58ad73b389aa.

Lee Yarwood (lyarwood)
description: updated
summary: Attempting to start or hard reboot a users instance as an admin with
- encrypted volumes leaves the instance unbootable
+ encrypted volumes leaves the instance unbootable when
+ [workarounds]disable_native_luksv1 is enabled
Revision history for this message
Balazs Gibizer (balazs-gibizer) wrote:
Changed in nova:
importance: Undecided → Medium
status: New → In Progress
OpenStack Infra (hudson-openstack) wrote: Fix merged to nova (stable/wallaby)

Reviewed: https://review.opendev.org/c/openstack/nova/+/785577
Committed: https://opendev.org/openstack/nova/commit/f99f667a96a357adc0070d75b5940e76726f9664
Submitter: "Zuul (22348)"
Branch: stable/wallaby

commit f99f667a96a357adc0070d75b5940e76726f9664
Author: Lee Yarwood <email address hidden>
Date: Wed Mar 3 12:33:49 2021 +0000

    libvirt: Simplify device_path check in _detach_encryptor

    Introduced by Id670f13a7f197e71c77dc91276fc2fba2fc5f314 to resolve bug
    #1821696, this check was put in place to ensure _detach_encryptor did not
    attempt to use the os-brick encryptors with an unsupported volume type
    after libvirt secrets had been removed outside the control of Nova.

    With the introduction of the [workarounds]disable_native_luksv1 option
    via Ia500eb614cf575ab846f64f4b69c9068274c8c1f, however, the use of
    _allow_native_luksv1 as part of this check is no longer valid, as the
    helper was updated to return False when the workaround is enabled,
    regardless of whether the underlying volume is attached natively.

    If an admin had enabled the workaround after users had launched
    instances with natively attached encrypted volumes *and* the libvirt
    secrets had gone missing _detach_encryptor would attempt to use the
    os-brick encryptors. This would fail when the underlying volume type is
    unsupported, for example rbd. See bug #1917619 for an example.

    This change resolves this corner case by dropping the use of
    _allow_native_luksv1 from the check and just asserting that a
    device_path is present for an encrypted volume before allowing the use
    of the os-brick encryptors. As noted this is safe as calls to the
    encryptors are idempotent, ignoring failures to detach when the
    underlying volume type is supported.

    Closes-Bug: #1917619
    Change-Id: Iba40c2df72228b461767d5734d5a62403d9f2cfa
    (cherry picked from commit 4908daed96ddda492ced6cbb084abe8f33a8b1f7)
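The corrected check described in the commit can be sketched as follows. This is a simplified model of the post-fix behaviour, not the merged Nova code:

```python
def detach_encryptor_fixed(connection_info, encryption):
    """Simplified post-fix check: use the os-brick encryptors only when
    a local device_path actually exists for the encrypted volume."""
    device_path = connection_info["data"].get("device_path")
    if encryption and not device_path:
        # No local block device: either libvirt decrypted the volume
        # natively or there is nothing to detach, so skip os-brick.
        return "skipped"
    return "os-brick detach"


# An rbd volume with no local device path is now skipped regardless of
# whether [workarounds]disable_native_luksv1 is enabled.
print(detach_encryptor_fixed({"data": {}}, {"provider": "luks"}))
# -> skipped
print(detach_encryptor_fixed({"data": {"device_path": "/dev/sdb"}},
                             {"provider": "luks"}))
# -> os-brick detach
```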

tags: added: in-stable-wallaby
OpenStack Infra (hudson-openstack) wrote: Fix included in openstack/nova 23.0.1

This issue was fixed in the openstack/nova 23.0.1 release.

Lee Yarwood (lyarwood)
Changed in nova:
status: In Progress → Fix Released
OpenStack Infra (hudson-openstack) wrote: Fix included in openstack/nova 24.0.0.0rc1

This issue was fixed in the openstack/nova 24.0.0.0rc1 release candidate.

OpenStack Infra (hudson-openstack) wrote: Change abandoned on nova (stable/ussuri)

Change abandoned by "Elod Illes <email address hidden>" on branch: stable/ussuri
Review: https://review.opendev.org/c/openstack/nova/+/785585
Reason: stable/ussuri branch of openstack/nova transitioned to End of Life and is about to be deleted. To be able to do that, all open patches need to be abandoned.

OpenStack Infra (hudson-openstack) wrote: Change abandoned on nova (stable/victoria)

Change abandoned by "Elod Illes <email address hidden>" on branch: stable/victoria
Review: https://review.opendev.org/c/openstack/nova/+/785584
Reason: stable/victoria branch of openstack/nova is about to be deleted. To be able to do that, all open patches need to be abandoned. Please cherry pick the patch to unmaintained/victoria if you want to further work on this patch.
