Pure Storage - Cloning multiple volumes failure (PowerVC)

Bug #1938579 reported by Simon Dodsley
This bug affects 1 person
Affects: Cinder
Status: Fix Released
Importance: Undecided
Assigned to: Simon Dodsley
Milestone: (none)

Bug Description

While performing a clone of multiple volumes in PowerVC 2.0, the operation fails.

Below is the stack trace:

2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/cinder/volume/manager.py", line 3457, in create_group_from_src
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server vol.save()
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 227, in __exit__
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server self.force_reraise()
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server raise self.value
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/cinder/volume/manager.py", line 3409, in create_group_from_src
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server sorted_snapshots, source_group, sorted_source_vols))
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/cinder/volume/drivers/pure_powervc.py", line 314, in create_group_from_src
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server self._create_restricted_metadata(volume, model_update)
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/cinder/volume/drivers/pure_powervc.py", line 272, in _create_restricted_metadata
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server vdisk = array.get_volume(volume_obj['provider_id'])
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/purestorage/purestorage.py", line 679, in get_volume
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server return self._request("GET", "volume/{0}".format(volume), kwargs)
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/cinder/volume/drivers/pure.py", line 1622, in wrapper
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server ret = fn(*args, **kwargs)
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.6/site-packages/purestorage/purestorage.py", line 203, in _request
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server raise PureHTTPError(self._target, str(self._rest_version), response)
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server purestorage.purestorage.PureHTTPError: PureHTTPError status code 400 returned by REST version 1.17 at 9.3.250.85: BAD REQUEST
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server [{"msg": "Volume does not exist.", "ctx": "None"}]
2021-07-28 10:42:38.538 4089805 ERROR oslo_messaging.rpc.server

This only happens in PowerVC, because the failing call originates from a customized PowerVC driver module, but the crash surfaces in the core Pure Storage Cinder driver.

The actual cause of the failure is that the volume object passed to get_volume through _create_restricted_metadata did not contain a provider_id.

2021-07-28 10:42:38.257 4089805 INFO cinder.volume.drivers.pure_powervc [req-b837a7b4-c49a-4f73-9189-a828b4bfff99 2a2f425c3d694ba3808723e167c2e9f3 6a560e186bcb49dfad3f33e6fce52f36 - - -] jfr volume is Volume(_name_id=None,admin_metadata={},attach_status='detached',availability_zone='nova',bootable=False,cluster=<?>,cluster_name=None,consistencygroup=<?>,consistencygroup_id=None,created_at=2021-07-28T14:42:35Z,deleted=False,deleted_at=None,display_description=None,display_name='clone-test-1',ec2_id=None,encryption_key_id=None,glance_metadata=<?>,group=<?>,group_id=70cace57-1ebb-497e-8a40-4506a1dbfcf9,host='pvc75#None',id=0a914f0c-3d9f-49eb-a666-cf240d23b059,launched_at=None,metadata={},migration_status=None,multiattach=False,previous_status=None,project_id='6a560e186bcb49dfad3f33e6fce52f36',provider_auth=None,provider_geometry=None,provider_id=None,provider_location=None,replication_driver_data=None,replication_extended_status=None,replication_status=None,scheduled_at=2021-07-28T14:42:36Z,service_uuid=None,shared_targets=True,size=1,snapshot_id=None,snapshots=<?>,source_volid=3679e245-7196-48b4-be11-f15bb9658781,status='creating',terminated_at=None,updated_at=2021-07-28T14:42:36Z,user_id='2a2f425c3d694ba3808723e167c2e9f3',volume_attachment=VolumeAttachmentList,volume_type=VolumeType(d9f7e4d8-ad2c-49c8-87c8-16924414fc39),volume_type_id=d9f7e4d8-ad2c-49c8-87c8-16924414fc39)
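
For illustration only, here is a minimal Python sketch of the failing lookup together with a defensive fallback, reconstructed from the traceback and the log entry above; the helper name resolve_provider_id and the fallback to the per-volume model update are assumptions, not the actual pure_powervc.py code.

    # Hypothetical sketch based on the traceback above; not the actual
    # pure_powervc.py implementation.

    def resolve_provider_id(volume_obj, model_update=None):
        """Return the backend volume name to pass to array.get_volume().

        In the failing case volume_obj['provider_id'] is None, so the
        driver effectively issued GET volume/None and FlashArray replied
        400 "Volume does not exist." Falling back to the per-volume
        model update avoids passing None to the array.
        """
        provider_id = volume_obj.get('provider_id')
        if not provider_id and model_update:
            provider_id = model_update.get('provider_id')
        if not provider_id:
            raise ValueError('provider_id missing for volume %s'
                             % volume_obj.get('id'))
        return provider_id

    # Mirrors the log entry above, where provider_id=None on the new clone:
    volume = {'id': '0a914f0c-3d9f-49eb-a666-cf240d23b059',
              'provider_id': None}
    update = {'id': volume['id'],
              'provider_id': 'volume-0a914f0c-3d9f-49eb-a666-cf240d23b059-cinder'}
    print(resolve_provider_id(volume, update))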

Changed in cinder:
assignee: nobody → Simon Dodsley (simon-dodsley)
status: New → In Progress
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to cinder (master)

Fix proposed to branch: master
Review: https://review.opendev.org/c/openstack/cinder/+/803046

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix merged to cinder (master)

Reviewed: https://review.opendev.org/c/openstack/cinder/+/803046
Committed: https://opendev.org/openstack/cinder/commit/f67eae51f93ea05e9cd8f956e5aef2b3bf84d7db
Submitter: "Zuul (22348)"
Branch: master

commit f67eae51f93ea05e9cd8f956e5aef2b3bf84d7db
Author: Simon Dodsley <email address hidden>
Date: Fri Jul 30 10:36:56 2021 -0400

    [Pure Storage] Resolve missing provider_id issue (PowerVC)

    In PowerVC, cloning of multiple volumes fails due to the
    provider_id not being populated correctly for a volume.

    Change-Id: Ib08a0561aa404a7aee41b23ec5d30f9f52096687
    Closes-Bug: #1938579
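
As a rough illustration of the approach the commit message describes, the sketch below shows per-volume model updates from create_group_from_src carrying a provider_id, so that downstream hooks such as PowerVC's _create_restricted_metadata never see None; the "volume-<id>-cinder" naming and the helper name are assumptions, not the merged patch verbatim.

    # Hedged sketch only: build volumes_model_update entries that include
    # provider_id.  The "volume-<id>-cinder" naming mirrors the Pure
    # driver's usual backend-name convention, but this is not the merged
    # change itself.

    def build_volumes_model_update(volumes):
        updates = []
        for volume in volumes:
            backend_name = 'volume-%s-cinder' % volume['id']
            updates.append({'id': volume['id'],
                            'provider_id': backend_name})
        return updates

    # Example: the cloned volume from the log above now carries a usable
    # provider_id for any later array.get_volume() lookup.
    print(build_volumes_model_update(
        [{'id': '0a914f0c-3d9f-49eb-a666-cf240d23b059'}]))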

Changed in cinder:
status: In Progress → Fix Released
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to cinder (stable/wallaby)

Fix proposed to branch: stable/wallaby
Review: https://review.opendev.org/c/openstack/cinder/+/804345

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix merged to cinder (stable/wallaby)

Reviewed: https://review.opendev.org/c/openstack/cinder/+/804345
Committed: https://opendev.org/openstack/cinder/commit/13e55d8ffc2d3b77925c950880acc36faf884178
Submitter: "Zuul (22348)"
Branch: stable/wallaby

commit 13e55d8ffc2d3b77925c950880acc36faf884178
Author: Simon Dodsley <email address hidden>
Date: Fri Jul 30 10:36:56 2021 -0400

    [Pure Storage] Resolve missing provider_id issue (PowerVC)

    In PowerVC, cloning of multiple volumes fails due to the
    provider_id not being populated correctly for a volume.

    Change-Id: Ib08a0561aa404a7aee41b23ec5d30f9f52096687
    Closes-Bug: #1938579
    (cherry picked from commit f67eae51f93ea05e9cd8f956e5aef2b3bf84d7db)

tags: added: in-stable-wallaby
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix included in openstack/cinder 18.1.0

This issue was fixed in the openstack/cinder 18.1.0 release.

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix included in openstack/cinder 19.0.0.0rc1

This issue was fixed in the openstack/cinder 19.0.0.0rc1 release candidate.
