Tempest test "test_manage_share_duplicate" fails sporadically

Bug #1848608 reported by Goutham Pacha Ravi on 2019-10-17
This bug affects 1 person
Affects: Manila
Importance: Medium
Assigned to: Douglas Viroel

Bug Description

This failure occurred on the Dummy driver's driver_handles_share_servers=False gate job in stable/rocky

Change that failed the job: https://review.opendev.org/#/c/688542/1
Logs: https://zuul.opendev.org/t/openstack/build/c34159d5c16d40dca18646ec51a2c944
(Relevant logs are attached with this bug since they will disappear from the log server eventually)

Failure:
{1} manila_tempest_tests.tests.api.admin.test_share_manage_negative.ManageNFSShareNegativeTest.test_manage_share_duplicate [16.104773s] ... FAILED

 Captured traceback:
 ~~~~~~~~~~~~~~~~~~~
     Traceback (most recent call last):
       File "/opt/stack/new/manila-tempest-plugin/manila_tempest_tests/tests/api/admin/test_share_manage_negative.py", line 202, in test_manage_share_duplicate
         **manage_params
       File "/usr/local/lib/python2.7/dist-packages/testtools/testcase.py", line 485, in assertRaises
         self.assertThat(our_callable, matcher)
       File "/usr/local/lib/python2.7/dist-packages/testtools/testcase.py", line 498, in assertThat
         raise mismatch_error
     testtools.matchers._impl.MismatchError: <bound method SharesV2Client.manage_share of <manila_tempest_tests.services.share.v2.json.shares_client.SharesV2Client object at 0x7fdf61a52b50>> returned {u'share_type_name': u'tempest-manage-st-name-1002350983', u'links': [{u'href': u'https://192.168.48.33:8786/v2/e4b0ed51fc5a4cfa9f7f1150eeda1d9d/shares/863cd0cf-10ce-4a00-ba1e-04c8f9eb8485', u'rel': u'self'}, {u'href': u'https://192.168.48.33:8786/e4b0ed51fc5a4cfa9f7f1150eeda1d9d/shares/863cd0cf-10ce-4a00-ba1e-04c8f9eb8485', u'rel': u'bookmark'}], u'availability_zone': None, u'share_network_id': None, u'share_server_id': None, u'snapshot_id': None, u'id': u'863cd0cf-10ce-4a00-ba1e-04c8f9eb8485', u'size': None, u'user_id': u'693a334256a245bf92d4e51924c7b989', u'share_type': u'3564fcf9-a40d-4c35-b763-17d81fb866dc', u'project_id': u'e4b0ed51fc5a4cfa9f7f1150eeda1d9d', u'metadata': {}, u'status': u'manage_starting', u'description': None, u'share_group_id': None, u'host': u'ubuntu-xenial-fortnebula-regionone-0012364087@gamma#fake_pool_for_GAMMA', u'revert_to_snapshot_support': False, u'access_rules_status': u'active', u'create_share_from_snapshot_support': False, u'is_public': False, u'task_state': None, u'snapshot_support': True, u'source_share_group_snapshot_member_id': None, u'name': None, u'has_replicas': False, u'replication_type': None, u'created_at': u'2019-10-17T17:52:33.000000', u'share_proto': u'NFS', u'volume_type': u'tempest-manage-st-name-1002350983', u'mount_snapshot_support': False}

Douglas Viroel (dviroel) wrote :

The log shows that the manage_share requests are slightly different:
The first request has "export_path": "10.0.0.10:/path/to/fake/share/share_d9646569_6e03_475c_b382_76242d73c9d1_d3b7c0c2_dac1_4cd6_88af_a4e3de8523d9"
The second request has a different one: "export_path": "10.0.0.20:/path/to/fake/share/share_d01ca7ad_40ac_42e7_818b_8758054785c9_cf0481e7_4aa7_4888_aad0_60eeae208b56"
The exception the test expects should be raised when a conflict is found during share creation, validated here: https://github.com/openstack/manila/blob/bffeef11b4aea4a9e718fb7814336149b1d2abfd/manila/share/api.py#L618

The issue happens when the export location list has more than one element and the elements are returned in a different order on each retrieval.
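
To illustrate the ordering problem (a minimal sketch with hypothetical names, not Manila's actual code): if the duplicate check only looks at one export location per existing share, a share that exposes several export locations can slip past the check whenever the backend happens to return a different element first.

    # Hypothetical sketch of the race: the duplicate check compares the incoming
    # export path against a single export location per existing share, so the
    # outcome depends on which element the backend happens to return first.
    def is_duplicate(new_export_path, existing_shares):
        for share in existing_shares:
            # Only element 0 is considered; the ordering of the underlying
            # list is not guaranteed.
            if share["export_locations"][0]["path"] == new_export_path:
                return True
        return False

    existing = [{
        "export_locations": [
            {"path": "10.0.0.20:/path/to/fake/share/share_x"},
            {"path": "10.0.0.10:/path/to/fake/share/share_x"},
        ],
    }]

    # "10.0.0.10:..." really is an export location of the existing share, but the
    # check misses it because element 0 happens to be "10.0.0.20:..." this time.
    print(is_duplicate("10.0.0.10:/path/to/fake/share/share_x", existing))  # False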

Goutham Pacha Ravi (gouthamr) wrote :

Ah! Nice find; so this is a day-0 bug (and not specific to the test in question, or the stable/rocky branch); it appears it's been around as long as we've allowed multiple export locations on a share. Thanks for the analysis, Douglas. Please feel free to own this bug.

Changed in manila:
importance: Undecided → Medium
status: New → Confirmed
status: Confirmed → New

Fix proposed to branch: master
Review: https://review.opendev.org/689283

Changed in manila:
assignee: nobody → Douglas Viroel (dviroel)
status: New → In Progress
Goutham Pacha Ravi (gouthamr) wrote :

Douglas, looking at the logs a bit more, I feel the problem is in manila's API and not the tests.

In the logs uploaded to this bug, the scenario is as follows:

* Share A is created and one of its export locations is noted
* Share A is unmanaged
* Share A is managed again as A', and one of its new export locations is noted
* Using this new export location, the test attempts to manage share A' again, expecting a failure

On the third step of this test, when export locations of A' are listed, we see this list:

"export_locations": [
  {"path": "10.0.0.20:/path/to/fake/share/share_d01ca7ad_40ac_42e7_818b_8758054785c9_cf0481e7_4aa7_4888_aad0_60eeae208b56",
   "share_instance_id": "cf0481e7-4aa7-4888-aad0-60eeae208b56", "is_admin_only": false, "id": "96f467ac-a4c5-4d8b-9811-b0c4ecfcb9a2", "preferred": false},
  {"path": "11.0.0.11:/path/to/fake/share/share_d01ca7ad_40ac_42e7_818b_8758054785c9_cf0481e7_4aa7_4888_aad0_60eeae208b56",
   "share_instance_id": "cf0481e7-4aa7-4888-aad0-60eeae208b56", "is_admin_only": true, "id": "9ba2c66e-a536-4969-bb12-75b9eac672c0", "preferred": false},
  {"path": "10.0.0.10:/path/to/fake/share/share_d01ca7ad_40ac_42e7_818b_8758054785c9_cf0481e7_4aa7_4888_aad0_60eeae208b56",
   "share_instance_id": "cf0481e7-4aa7-4888-aad0-60eeae208b56", "is_admin_only": false, "id": "82e060b5-1da4-4642-8881-d1bc1da8482a", "preferred": true}
]

The test uses the first of these to attempt the manage operation again:

"export_path": "10.0.0.20:/path/to/fake/share/share_d01ca7ad_40ac_42e7_818b_8758054785c9_cf0481e7_4aa7_4888_aad0_60eeae208b56",

A failure is expected here due to [1], but somehow the request doesn't fail.

[1] https://opendev.org/openstack/manila/src/commit/b6b2eb5cf3a32e506af8734acecb4a34ceafc982/manila/share/api.py#L724-L728
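
For context, here is a rough sketch of the kind of validation [1] is expected to perform; the helper names and data shapes below are assumptions for illustration, not Manila's actual internals. The point is that a manage request should be rejected when any existing share on the same host already owns the requested export path.

    class ShareAlreadyManaged(Exception):
        """Raised when the requested export path already belongs to a managed share."""

    def validate_manage_request(export_path, host, list_shares):
        """Reject a manage call whose export path is already owned by a share.

        ``list_shares`` is a hypothetical callable returning the existing shares
        on the given host, each carrying all of its export location paths.
        """
        for share in list_shares(host=host):
            if export_path in share["export_location_paths"]:
                raise ShareAlreadyManaged(
                    "Share %s already uses export path %s" % (share["id"], export_path))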

Goutham Pacha Ravi (gouthamr) wrote :

The retrieval logic for search options looks suspicious:

https://opendev.org/openstack/manila/src/commit/b6b2eb5cf3a32e506af8734acecb4a34ceafc982/manila/share/api.py#L1709-L1718

We attempt to retrieve the export location with the key "export_location" on the Share model (https://opendev.org/openstack/manila/src/commit/b6b2eb5cf3a32e506af8734acecb4a34ceafc982/manila/share/api.py#L671), but there is no such column in the database schema.

There's a meta property on the Share model that just fetches the first export location of the share's instance:

- https://opendev.org/openstack/manila/src/commit/b6b2eb5cf3a32e506af8734acecb4a34ceafc982/manila/db/sqlalchemy/models.py#L206-L208

- https://opendev.org/openstack/manila/src/commit/b6b2eb5cf3a32e506af8734acecb4a34ceafc982/manila/db/sqlalchemy/models.py#L353-L355
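
Roughly, the property chain referenced above behaves like this (a simplified sketch, not the verbatim model code): the Share model delegates to its instance, which returns only the first element of its export location list.

    # Simplified sketch of the two properties linked above (not verbatim Manila
    # code): Share.export_location delegates to the instance, and the instance
    # returns only the first element of its export location list.
    class ShareInstance(object):
        def __init__(self, export_locations):
            # Populated by the DB layer; the ordering is not guaranteed.
            self.export_locations = export_locations

        @property
        def export_location(self):
            if self.export_locations:
                return self.export_locations[0]['path']

    class Share(object):
        def __init__(self, instance):
            self.instance = instance

        @property
        def export_location(self):
            if self.instance:
                return self.instance.export_location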

I think the problem is here; instead of relying on that deprecated property (export_location), we should convert the key into "export_location_path", which should fix the issue.
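
Something along these lines would capture the idea (a sketch only; the actual patch is at https://review.opendev.org/689283 and may differ): rewrite the deprecated search key before it reaches the DB layer, so filtering happens against every export location path of a share rather than the model's synthetic "first export location" property.

    # Sketch of the proposed key translation (hypothetical helper name, not the
    # actual patch): rewrite the deprecated "export_location" search option as
    # "export_location_path" so the DB filter matches any of a share's export
    # location paths instead of the Share model's property.
    def normalize_manage_search_opts(search_opts):
        if 'export_location' in search_opts:
            search_opts['export_location_path'] = search_opts.pop('export_location')
        return search_opts

    print(normalize_manage_search_opts(
        {'export_location': '10.0.0.10:/path/to/fake/share/share_x'}))
    # {'export_location_path': '10.0.0.10:/path/to/fake/share/share_x'}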

Changed in manila:
milestone: none → ussuri-1
tags: added: backport-potential