tempest.api.compute.v3.admin.test_servers.ServersAdminV3TestJSON.test_list_servers_by_admin_with_all_tenants FAIL due to Infocache failure in nova conductor

Bug #1269687 reported by Lianhao Lu
This bug affects 1 person

Affects: OpenStack Compute (nova)
Status: New
Importance: Undecided
Assigned to: Unassigned

Bug Description

Saw the following tempest failure on Jenkins:

2014-01-16 05:23:43.064 | ======================================================================
2014-01-16 05:23:43.064 | FAIL: tempest.api.compute.v3.admin.test_servers.ServersAdminV3TestJSON.test_list_servers_by_admin_with_all_tenants[gate]
2014-01-16 05:23:43.064 | tempest.api.compute.v3.admin.test_servers.ServersAdminV3TestJSON.test_list_servers_by_admin_with_all_tenants[gate]
2014-01-16 05:23:43.064 | ----------------------------------------------------------------------
2014-01-16 05:23:43.064 | _StringException: Empty attachments:
2014-01-16 05:23:43.064 | stderr
2014-01-16 05:23:43.064 | stdout
2014-01-16 05:23:43.064 |
2014-01-16 05:23:43.064 | pythonlogging:'': {{{
2014-01-16 05:23:43.065 | 2014-01-16 04:55:18,202 Request: GET http://127.0.0.1:8774/v3/servers/detail?all_tenants=
2014-01-16 05:23:43.065 | 2014-01-16 04:55:18,203 Request Headers: {'X-Auth-Token': '<Token omitted>'}
2014-01-16 05:23:43.065 | 2014-01-16 04:55:18,771 Response Status: 404
2014-01-16 05:23:43.065 | 2014-01-16 04:55:18,771 Nova request id: req-f594f164-3ce2-4f15-b3cd-d03864586c59
2014-01-16 05:23:43.065 | 2014-01-16 04:55:18,771 Response Headers: {'content-length': '78', 'date': 'Thu, 16 Jan 2014 04:55:18 GMT', 'content-type': 'application/json; charset=UTF-8', 'connection': 'close'}
2014-01-16 05:23:43.065 | 2014-01-16 04:55:18,772 Response Body: {"itemNotFound": {"message": "The resource could not be found.", "code": 404}}
2014-01-16 05:23:43.065 | }}}
2014-01-16 05:23:43.065 |
2014-01-16 05:23:43.065 | Traceback (most recent call last):
2014-01-16 05:23:43.065 | File "tempest/api/compute/v3/admin/test_servers.py", line 72, in test_list_servers_by_admin_with_all_tenants
2014-01-16 05:23:43.065 | resp, body = self.client.list_servers_with_detail(params)
2014-01-16 05:23:43.065 | File "tempest/services/compute/v3/json/servers_client.py", line 159, in list_servers_with_detail
2014-01-16 05:23:43.066 | resp, body = self.get(url)
2014-01-16 05:23:43.066 | File "tempest/common/rest_client.py", line 305, in get
2014-01-16 05:23:43.066 | return self.request('GET', url, headers)
2014-01-16 05:23:43.066 | File "tempest/common/rest_client.py", line 436, in request
2014-01-16 05:23:43.066 | resp, resp_body)
2014-01-16 05:23:43.066 | File "tempest/common/rest_client.py", line 481, in _error_checker
2014-01-16 05:23:43.066 | raise exceptions.NotFound(resp_body)
2014-01-16 05:23:43.066 | NotFound: Object not found
2014-01-16 05:23:43.066 | Details: {"itemNotFound": {"message": "The resource could not be found.", "code": 404}}
2014-01-16 05:23:43.066 |
2014-01-16 05:23:43.066 |
2014-01-16 05:23:43.066 | ======================================================================
2014-01-16 05:23:43.067 | FAIL: process-returncode
2014-01-16 05:23:43.067 | process-returncode
2014-01-16 05:23:43.067 | ----------------------------------------------------------------------
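
For context, the failing call is just a GET on the v3 servers detail endpoint with an empty all_tenants value, as shown in the request log above. A minimal sketch to replay it outside tempest follows; the endpoint and token here are placeholders, not values from the gate run.

import requests  # assumes python-requests is installed

NOVA_V3 = "http://127.0.0.1:8774/v3"   # devstack endpoint from the log above
TOKEN = "<valid admin token>"          # placeholder; obtain from keystone

# Same request tempest issued: GET /v3/servers/detail?all_tenants=
resp = requests.get(NOVA_V3 + "/servers/detail",
                    params={"all_tenants": ""},
                    headers={"X-Auth-Token": TOKEN})

# Expected: 200 with a server list; the bug manifests as a 404 with
# {"itemNotFound": {"message": "The resource could not be found.", ...}}
print(resp.status_code)
print(resp.text)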

The corresponding failure output from nova-compute is shown below:

2014-01-16 04:55:16.562 29793 ERROR nova.virt.driver [-] Exception dispatching event <nova.virt.event.LifecycleEvent object at 0x3e2f610>: Info cache for instance eaf69096-c88d-4353-80b3-faaf0e0c064b could not be found.
Traceback (most recent call last):

  File "/opt/stack/new/nova/nova/conductor/manager.py", line 576, in _object_dispatch
    return getattr(target, method)(context, *args, **kwargs)

  File "/opt/stack/new/nova/nova/objects/base.py", line 152, in wrapper
    return fn(self, ctxt, *args, **kwargs)

  File "/opt/stack/new/nova/nova/objects/instance.py", line 486, in refresh
    self.info_cache.refresh()

  File "/opt/stack/new/nova/nova/objects/base.py", line 152, in wrapper
    return fn(self, ctxt, *args, **kwargs)

  File "/opt/stack/new/nova/nova/objects/instance_info_cache.py", line 103, in refresh
    self.instance_uuid)

  File "/opt/stack/new/nova/nova/objects/base.py", line 112, in wrapper
    result = fn(cls, context, *args, **kwargs)

  File "/opt/stack/new/nova/nova/objects/instance_info_cache.py", line 70, in get_by_instance_uuid
    instance_uuid=instance_uuid)

InstanceInfoCacheNotFound: Info cache for instance eaf69096-c88d-4353-80b3-faaf0e0c064b could not be found.

2014-01-16 04:55:16.563+0000: 32352: debug : virEventRunDefaultImpl:244 : running default event implementation
2014-01-16 04:55:16.563+0000: 32352: debug : virEventPollCleanupTimeouts:506 : Cleanup 2
2014-01-16 04:55:16.563+0000: 32352: debug : virEventPollCleanupHandles:554 : Cleanup 2
2014-01-16 04:55:16.563+0000: 32352: debug : virEventPollMakePollFDs:383 : Prepare n=0 w=1, f=8 e=1 d=0
2014-01-16 04:55:16.563+0000: 32352: debug : virEventPollMakePollFDs:383 : Prepare n=1 w=2, f=12 e=1 d=0
2014-01-16 04:55:16.563+0000: 32352: debug : virEventPollCalculateTimeout:325 : Calculate expiry of 2 timers
2014-01-16 04:55:16.563+0000: 32352: debug : virEventPollCalculateTimeout:351 : Timeout at 0 due in -1 ms
2014-01-16 04:55:16.563+0000: 32352: debug : virEventPollRunOnce:619 : EVENT_POLL_RUN: nhandles=2 timeout=-1
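
The traceback suggests a race: a libvirt lifecycle event for the instance is still being processed after the instance's info cache row has been deleted, so Instance.refresh() raises InstanceInfoCacheNotFound in the conductor. As a hedged sketch only (not a proposed patch; the helper name and placement are assumptions based on the traceback), one defensive option would be to tolerate the missing cache on the event path:

# Sketch only: make the lifecycle-event path tolerate an instance
# whose info cache has already been deleted. Names follow the nova
# tree from the traceback; _sync_instance_for_event is hypothetical.
from nova import exception
from nova.openstack.common import log as logging

LOG = logging.getLogger(__name__)

def _sync_instance_for_event(instance):
    try:
        # This is the call chain that fails above:
        # Instance.refresh() -> info_cache.refresh()
        instance.refresh()
    except exception.InstanceInfoCacheNotFound:
        # Instance is being torn down concurrently; the event is
        # stale, so drop it rather than let the error propagate.
        LOG.debug("Info cache for %s is gone; ignoring lifecycle event",
                  instance.uuid)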

Matt Riedemann (mriedem) wrote :

Is this related to bug 1266919 (or maybe a duplicate)? There is already a fix for that, but the error also shows up on successful runs, so it may not be the root issue here.

I'm inclined to think this is something on its own.

tags: added: testing unified-objects
Matt Riedemann (mriedem) wrote :

I'm thinking this is a duplicate of bug 1258620, which Aaron Rosen is working on for icehouse-rc1.

Dan Smith (danms)
tags: removed: unified-objects
Matt Riedemann (mriedem)
tags: added: network