DeviceTaggingTestV2_42.test_device_tagging randomly fails with "mount: mounting /dev/sr0 on /mnt failed: Device or resource busy"

Bug #1706397 reported by Matt Riedemann
Affects: tempest
Status: Confirmed
Importance: Low
Assigned to: Unassigned

Bug Description

http://logs.openstack.org/66/483566/10/check/gate-tempest-dsvm-neutron-nova-next-full-ubuntu-xenial-nv/c816bd9/logs/tempest.txt.gz#_2017-07-24_16_43_36_846

2017-07-24 16:43:36.846 1571 ERROR tempest.lib.common.utils.linux.remote_client [-] (DeviceTaggingTestV2_42:test_device_tagging) Initializing SSH connection to 172.24.5.4 failed. Error: Command 'set -eu -o pipefail; PATH=$PATH:/sbin; sudo mount /dev/sr0 /mnt', exit status: 255, stderr:
mount: mounting /dev/sr0 on /mnt failed: Device or resource busy

stdout:
: SSHExecCommandFailed: Command 'set -eu -o pipefail; PATH=$PATH:/sbin; sudo mount /dev/sr0 /mnt', exit status: 255, stderr:
mount: mounting /dev/sr0 on /mnt failed: Device or resource busy

stdout:

http://logs.openstack.org/66/483566/10/check/gate-tempest-dsvm-neutron-nova-next-full-ubuntu-xenial-nv/c816bd9/console.html#_2017-07-24_17_19_13_176110

2017-07-24 17:19:13.176110 | tempest.api.compute.servers.test_device_tagging.DeviceTaggingTestV2_42.test_device_tagging[id-a2e65a6c-66f1-4442-aaa8-498c31778d96,image,network,volume]
2017-07-24 17:19:13.176163 | --------------------------------------------------------------------------------------------------------------------------------------------------------
2017-07-24 17:19:13.176171 |
2017-07-24 17:19:13.176181 | Captured traceback:
2017-07-24 17:19:13.176191 | ~~~~~~~~~~~~~~~~~~~
2017-07-24 17:19:13.176205 | Traceback (most recent call last):
2017-07-24 17:19:13.176221 | File "tempest/test.py", line 103, in wrapper
2017-07-24 17:19:13.176237 | return f(self, *func_args, **func_kwargs)
2017-07-24 17:19:13.176263 | File "tempest/api/compute/servers/test_device_tagging.py", line 271, in test_device_tagging
2017-07-24 17:19:13.176284 | self.ssh_client.exec_command('sudo mount %s /mnt' % dev_name)
2017-07-24 17:19:13.176307 | File "tempest/lib/common/utils/linux/remote_client.py", line 30, in wrapper
2017-07-24 17:19:13.176323 | return function(self, *args, **kwargs)
2017-07-24 17:19:13.176346 | File "tempest/lib/common/utils/linux/remote_client.py", line 105, in exec_command
2017-07-24 17:19:13.176363 | return self.ssh_client.exec_command(cmd)
2017-07-24 17:19:13.176383 | File "tempest/lib/common/ssh.py", line 202, in exec_command
2017-07-24 17:19:13.176397 | stderr=err_data, stdout=out_data)
2017-07-24 17:19:13.176433 | tempest.lib.exceptions.SSHExecCommandFailed: Command 'set -eu -o pipefail; PATH=$PATH:/sbin; sudo mount /dev/sr0 /mnt', exit status: 255, stderr:
2017-07-24 17:19:13.176464 | mount: mounting /dev/sr0 on /mnt failed: Device or resource busy
2017-07-24 17:19:13.176470 |
2017-07-24 17:19:13.176478 | stdout:

The test dumps out some lsblk information after it fails:

2017-07-24 16:43:38.010 1571 ERROR tempest.api.compute.servers.test_device_tagging [-] Mounting /dev/sr0 on /mnt failed. Right after the failure 'lsblk' in the guest reported:
NAME FSTYPE LABEL MOUNTPOINT
vda
`-vda1 ext3 cirros-rootfs /
vdb
vdc
sr0 iso9660 config-2
: SSHExecCommandFailed: Command 'set -eu -o pipefail; PATH=$PATH:/sbin; sudo mount /dev/sr0 /mnt', exit status: 255, stderr:
mount: mounting /dev/sr0 on /mnt failed: Device or resource busy

stdout:

But in this case it doesn't really tell us much. Maybe we should retry up to 3 times to mount the config drive to read from it?
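The retry idea could look something like the sketch below. This is only an illustration, not the tempest implementation: `mount_with_retry`, its parameters, and the generic `exc_type` argument are all hypothetical, standing in for the test's `self.ssh_client.exec_command(...)` call and tempest's `SSHExecCommandFailed` exception.

```python
import time

# Hypothetical sketch: retry the config drive mount a few times before
# giving up. `ssh_client.exec_command` mirrors the tempest remote client
# call used by the test; `exc_type` would be SSHExecCommandFailed there.
def mount_with_retry(ssh_client, dev_name, attempts=3, delay=5,
                     exc_type=Exception):
    """Try 'sudo mount <dev> /mnt' up to `attempts` times."""
    for attempt in range(1, attempts + 1):
        try:
            return ssh_client.exec_command('sudo mount %s /mnt' % dev_name)
        except exc_type:
            if attempt == attempts:
                raise  # out of retries, surface the original failure
            time.sleep(delay)
```

A short sleep between attempts would give the guest time to finish whatever is holding /dev/sr0 busy, at the cost of masking the underlying race if one exists.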

Revision history for this message
Matt Riedemann (mriedem) wrote :

Or maybe we should dump the guest console when this fails to see if we're trying to mount the config drive too early?
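Dumping the console on failure might be sketched as follows. This is an assumption-laden illustration: `log_guest_console` is a made-up helper, and while tempest's compute servers client does expose a `get_console_output` call, the exact wiring shown here is not taken from the test.

```python
import logging

LOG = logging.getLogger(__name__)

# Hypothetical sketch: on mount failure, dump the guest console so we can
# see whether the config drive was attached too early. `servers_client`
# stands in for tempest's compute servers client.
def log_guest_console(servers_client, server_id):
    output = servers_client.get_console_output(server_id)['output']
    LOG.error('Guest console for server %s:\n%s', server_id, output)
    return output
```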

Matt Riedemann (mriedem)
no longer affects: nova
Changed in tempest:
status: New → Confirmed
importance: Undecided → Low