Removing snapshots puts a volume into a suspended state

Bug #1706706 reported by Matthew Heler

Affects: Cinder
Status: New
Importance: Undecided
Assigned to: Unassigned

Bug Description

We have a customer that uses the LVM Cinder driver and relies fairly heavily on snapshots. Recently it was reported that, when a snapshot was deleted, the origin volume the snapshot is based on would be left suspended in LVM. The Cinder logs show that lvremove failed on the first attempt but then succeeded on retry. This appears to happen only on a few logical volumes, not on all of the volumes in the environment; however, on the volumes where it does occur, the issue is reproducible.

2017-07-26 13:35:22.807 6177 DEBUG oslo_concurrency.lockutils [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] Lock "e6d93a29-7ee5-4842-bf59-bc87ea01794e-delete_snapshot" acquired by "cinder.volume.manager.lso_inner2" :: waited 0.001s inner /openstack/venvs/cinder-13.3.4/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-07-26 13:35:22.981 6177 DEBUG oslo_concurrency.processutils [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] Running cmd (subprocess): sudo cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C lvs --noheadings --unit=g -o vg_name,name,size --nosuffix cinder-volumes/_snapshot-e6d93a29-7ee5-4842-bf59-bc87ea01794e execute /openstack/venvs/cinder-13.3.4/lib/python2.7/site-packages/oslo_concurrency/processutils.py:344
2017-07-26 13:35:23.065 6177 DEBUG oslo_concurrency.processutils [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] CMD "sudo cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C lvs --noheadings --unit=g -o vg_name,name,size --nosuffix cinder-volumes/_snapshot-e6d93a29-7ee5-4842-bf59-bc87ea01794e" returned: 0 in 0.084s execute /openstack/venvs/cinder-13.3.4/lib/python2.7/site-packages/oslo_concurrency/processutils.py:374
2017-07-26 13:35:23.066 6177 INFO cinder.volume.utils [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] Performing secure delete on volume: /dev/mapper/cinder--volumes-_snapshot--e6d93a29--7ee5--4842--bf59--bc87ea01794e-cow
2017-07-26 13:35:23.067 6177 DEBUG oslo_concurrency.processutils [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] Running cmd (subprocess): sudo cinder-rootwrap /etc/cinder/rootwrap.conf dd count=0 if=/dev/zero of=/dev/mapper/cinder--volumes-_snapshot--e6d93a29--7ee5--4842--bf59--bc87ea01794e-cow oflag=direct execute /openstack/venvs/cinder-13.3.4/lib/python2.7/site-packages/oslo_concurrency/processutils.py:344
2017-07-26 13:35:23.114 6177 DEBUG oslo_concurrency.processutils [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] CMD "sudo cinder-rootwrap /etc/cinder/rootwrap.conf dd count=0 if=/dev/zero of=/dev/mapper/cinder--volumes-_snapshot--e6d93a29--7ee5--4842--bf59--bc87ea01794e-cow oflag=direct" returned: 0 in 0.047s execute /openstack/venvs/cinder-13.3.4/lib/python2.7/site-packages/oslo_concurrency/processutils.py:374
2017-07-26 13:35:23.115 6177 DEBUG oslo_concurrency.processutils [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] Running cmd (subprocess): sudo cinder-rootwrap /etc/cinder/rootwrap.conf ionice -c3 dd if=/dev/zero of=/dev/mapper/cinder--volumes-_snapshot--e6d93a29--7ee5--4842--bf59--bc87ea01794e-cow count=100 bs=1M oflag=direct execute /openstack/venvs/cinder-13.3.4/lib/python2.7/site-packages/oslo_concurrency/processutils.py:344
2017-07-26 13:35:23.209 6177 DEBUG oslo_concurrency.processutils [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] CMD "sudo cinder-rootwrap /etc/cinder/rootwrap.conf ionice -c3 dd if=/dev/zero of=/dev/mapper/cinder--volumes-_snapshot--e6d93a29--7ee5--4842--bf59--bc87ea01794e-cow count=100 bs=1M oflag=direct" returned: 0 in 0.094s execute /openstack/venvs/cinder-13.3.4/lib/python2.7/site-packages/oslo_concurrency/processutils.py:374
2017-07-26 13:35:23.209 6177 DEBUG cinder.volume.utils [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] Volume copy details: src /dev/zero, dest /dev/mapper/cinder--volumes-_snapshot--e6d93a29--7ee5--4842--bf59--bc87ea01794e-cow, size 100.00 MB, duration 1.00 sec _copy_volume_with_path /openstack/venvs/cinder-13.3.4/lib/python2.7/site-packages/cinder/volume/utils.py:364
2017-07-26 13:35:23.210 6177 DEBUG oslo_concurrency.processutils [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] Running cmd (subprocess): sudo cinder-rootwrap /etc/cinder/rootwrap.conf lvremove --config activation { retry_deactivation = 1} -f cinder-volumes/_snapshot-e6d93a29-7ee5-4842-bf59-bc87ea01794e execute /openstack/venvs/cinder-13.3.4/lib/python2.7/site-packages/oslo_concurrency/processutils.py:344
2017-07-26 13:35:23.320 6177 DEBUG oslo_concurrency.processutils [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] CMD "sudo cinder-rootwrap /etc/cinder/rootwrap.conf lvremove --config activation { retry_deactivation = 1} -f cinder-volumes/_snapshot-e6d93a29-7ee5-4842-bf59-bc87ea01794e" returned: 5 in 0.110s execute /openstack/venvs/cinder-13.3.4/lib/python2.7/site-packages/oslo_concurrency/processutils.py:374
2017-07-26 13:35:23.321 6177 DEBUG oslo_concurrency.processutils [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] u'sudo cinder-rootwrap /etc/cinder/rootwrap.conf lvremove --config activation { retry_deactivation = 1} -f cinder-volumes/_snapshot-e6d93a29-7ee5-4842-bf59-bc87ea01794e' failed. Not Retrying. execute /openstack/venvs/cinder-13.3.4/lib/python2.7/site-packages/oslo_concurrency/processutils.py:422
2017-07-26 13:35:23.321 6177 DEBUG cinder.brick.local_dev.lvm [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] Error reported running lvremove: CMD: sudo cinder-rootwrap /etc/cinder/rootwrap.conf lvremove --config activation { retry_deactivation = 1} -f cinder-volumes/_snapshot-e6d93a29-7ee5-4842-bf59-bc87ea01794e, RESPONSE: Unable to deactivate open cinder--volumes-_snapshot--e6d93a29--7ee5--4842--bf59--bc87ea01794e-cow (252:310)
  Failed to resume _snapshot-e6d93a29-7ee5-4842-bf59-bc87ea01794e.
2017-07-26 13:35:23.415 6177 DEBUG oslo_concurrency.processutils [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] Running cmd (subprocess): sudo cinder-rootwrap /etc/cinder/rootwrap.conf lvremove --config activation { retry_deactivation = 1} devices { ignore_suspended_devices = 1} -f cinder-volumes/_snapshot-e6d93a29-7ee5-4842-bf59-bc87ea01794e execute /openstack/venvs/cinder-13.3.4/lib/python2.7/site-packages/oslo_concurrency/processutils.py:344
2017-07-26 13:35:23.505 6177 DEBUG oslo_concurrency.processutils [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] CMD "sudo cinder-rootwrap /etc/cinder/rootwrap.conf lvremove --config activation { retry_deactivation = 1} devices { ignore_suspended_devices = 1} -f cinder-volumes/_snapshot-e6d93a29-7ee5-4842-bf59-bc87ea01794e" returned: 0 in 0.090s execute /openstack/venvs/cinder-13.3.4/lib/python2.7/site-packages/oslo_concurrency/processutils.py:374
2017-07-26 13:35:23.506 6177 DEBUG cinder.brick.local_dev.lvm [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] Successfully deleted volume: _snapshot-e6d93a29-7ee5-4842-bf59-bc87ea01794e after udev settle. delete /openstack/venvs/cinder-13.3.4/lib/python2.7/site-packages/cinder/brick/local_dev/lvm.py:703
2017-07-26 13:35:23.599 6177 DEBUG oslo_concurrency.lockutils [req-a974e739-13af-4d2d-b37a-d60231fb2f81 0fe383e9dce146c3add571befdaeb581 feacaa2326204509bd19e395917b47d6 - - -] Lock "e6d93a29-7ee5-4842-bf59-bc87ea01794e-delete_snapshot" released by "cinder.volume.manager.lso_inner2" :: held 0.792s inner /openstack/venvs/cinder-13.3.4/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
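The delete sequence in the log above can be sketched roughly as follows. This is not the actual Cinder code, just an illustrative Python outline of the visible behaviour: the first lvremove passes only retry_deactivation, and when it fails (exit code 5, "Unable to deactivate open ... -cow"), the driver settles udev and retries with ignore_suspended_devices=1, which succeeds but can leave the origin volume suspended:

```python
import subprocess


def remove_lv(lv_path, run=subprocess.run):
    """Illustrative sketch of the two-step lvremove seen in the logs
    (the real logic lives in cinder/brick/local_dev/lvm.py)."""
    # First attempt: only retry_deactivation, as in the first lvremove above.
    result = run(['lvremove', '--config',
                  'activation { retry_deactivation = 1}',
                  '-f', lv_path])
    if result.returncode == 0:
        return 'first-try'
    # First attempt failed; settle udev and retry, additionally telling LVM
    # to ignore suspended devices (matches the second lvremove in the log).
    run(['udevadm', 'settle'])
    result = run(['lvremove', '--config',
                  'activation { retry_deactivation = 1} '
                  'devices { ignore_suspended_devices = 1}',
                  '-f', lv_path])
    if result.returncode != 0:
        raise RuntimeError('lvremove failed twice for %s' % lv_path)
    return 'after-settle'
```

The retry masks the underlying deactivation failure: the snapshot LV is removed, but nothing resumes the origin volume, which is consistent with the suspended state shown in the lvdisplay output below.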

lvdisplay /dev/cinder-volumes/volume-a67e3fdb-b28a-463c-b3f2-b78c6cecddc8
  --- Logical volume ---
  LV Path /dev/cinder-volumes/volume-a67e3fdb-b28a-463c-b3f2-b78c6cecddc8
  LV Name volume-a67e3fdb-b28a-463c-b3f2-b78c6cecddc8
  VG Name cinder-volumes
  LV UUID tRKy2J-F6SV-gVD1-jbYP-DWBC-e31O-dDP3Vf
  LV Write Access read/write
  LV Creation host, time 99999-cinder24, 2017-03-08 13:55:53 +0000
  LV snapshot status source of
                         _snapshot-ba0d145c-f1e6-44e1-8bb4-ea17e4813796 [active]
                         _snapshot-6ed1056a-71c0-4804-b0eb-fa8eef2a74e3 [active]
                         _snapshot-3fd389a0-b970-4597-bf86-df3424dba54d [active]
                         _snapshot-c8c7a872-8da2-4bcc-9555-3945eb2f6485 [active]
                         _snapshot-f0cc84d6-7cee-44f4-aaae-19c2adc121b1 [active]
  LV Status suspended
  # open 1
  LV Size 50.00 GiB
  Current LE 12800
  Segments 1
  Allocation inherit
  Read ahead sectors auto
  - currently set to 256
  Block device 252:100

    Wiping internal VG cache
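To find which volumes have been left in this state, the lvdisplay output can be scanned for the "LV Status suspended" line. A minimal sketch (field labels taken from the output above; this helper is illustrative, not part of Cinder):

```python
def suspended_lvs(lvdisplay_output):
    """Return the LV names whose 'LV Status' is 'suspended'
    in the text output of `lvdisplay`."""
    suspended = []
    current_name = None
    for line in lvdisplay_output.splitlines():
        line = line.strip()
        if line.startswith('LV Name'):
            # e.g. "LV Name volume-a67e3fdb-..." -> third field is the name.
            current_name = line.split(None, 2)[2]
        elif line.startswith('LV Status') and line.endswith('suspended'):
            suspended.append(current_name)
    return suspended
```

Running this across `lvdisplay` output for the whole VG would flag origin volumes such as volume-a67e3fdb-b28a-463c-b3f2-b78c6cecddc8 above.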

Tags: drivers lvm
Matthew Heler (rackspace.matt) wrote:

This OpenStack environment is running Mitaka.

tags: added: drivers lvm