Comment 1 for bug 1931710

Balazs Gibizer (balazs-gibizer) wrote :

The same issue can happen in other scenarios, e.g.:

Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager Traceback (most recent call last):
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 9856, in _update_available_resource_for_node
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager self.rt.update_available_resource(context, nodename,
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/resource_tracker.py", line 879, in update_available_resource
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager resources = self.driver.get_available_resource(nodename)
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 8858, in get_available_resource
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager disk_info_dict = self._get_local_gb_info()
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 7299, in _get_local_gb_info
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager info = lvm.get_volume_group_info(
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager File "/opt/stack/nova/nova/virt/libvirt/storage/lvm.py", line 92, in get_volume_group_info
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager out, err = nova.privsep.fs.vginfo(vg)
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager File "/usr/local/lib/python3.8/dist-packages/oslo_privsep/priv_context.py", line 247, in _wrap
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager return self.channel.remote_call(name, args, kwargs)
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager File "/usr/local/lib/python3.8/dist-packages/oslo_privsep/daemon.py", line 224, in remote_call
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager raise exc_type(*result[2])
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager Command: vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free stack-volumes-default
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager Exit code: -11
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager Stdout: ''
Jun 03 05:20:33.248693 ubuntu-focal-inap-mtl01-0024946450 nova-compute[90971]: ERROR nova.compute.manager Stderr: ' WARNING: Failed to get udev device handler for device /dev/sda1.\n /dev/sda15: stat failed: No such file or directory\n Path /dev/sda15 no longer valid for device(8,15)\n /dev/sda15: stat failed: No such file or directory\n Path /dev/sda15 no longer valid for device(8,15)\n Device open /dev/sda 8:0 failed errno 2\n Device open /dev/sda 8:0 failed errno 2\n Device open /dev/sda1 8:1 failed errno 2\n Device open /dev/sda1 8:1 failed errno 2\n WARNING: Scan ignoring device 8:0 with no paths.\n WARNING: Scan ignoring device 8:1 with no paths.\n'

In this case the job succeeds because the periodic task is simply skipped on error, and the next run of the task succeeds, so this is probably a transient failure in lvm (exit code -11 indicates the vgs process was killed by signal 11, SIGSEGV). A retry in nova could help.
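A minimal sketch of the kind of retry that could be wrapped around the privsep call. All names here are illustrative, not nova's actual code: the exception class stands in for oslo_concurrency.processutils.ProcessExecutionError, and the flaky vginfo below simulates the one-off lvm failure seen above.

```python
import time


class ProcessExecutionError(Exception):
    """Stand-in for oslo_concurrency.processutils.ProcessExecutionError."""


def retry_on_failure(attempts=3, delay=0.0):
    """Retry the wrapped call on ProcessExecutionError, up to `attempts` times."""
    def decorator(func):
        def wrapper(*args, **kwargs):
            for attempt in range(1, attempts + 1):
                try:
                    return func(*args, **kwargs)
                except ProcessExecutionError:
                    if attempt == attempts:
                        raise  # exhausted retries; propagate as before
                    time.sleep(delay)
        return wrapper
    return decorator


# Hypothetical flaky call: fails once (like the transient vgs segfault),
# then succeeds on the next invocation.
calls = {"n": 0}


@retry_on_failure(attempts=3)
def vginfo(vg):
    calls["n"] += 1
    if calls["n"] == 1:
        raise ProcessExecutionError("vgs exited with -11")
    return ("10737418240|5368709120", "")


print(vginfo("stack-volumes-default"))  # succeeds on the second attempt
```

With a wrapper like this, a single transient lvm error would no longer cause the whole update_available_resource pass to be skipped; only a persistent failure would still surface.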