Seen here:
http://logs.openstack.xenproject.org/35/246635/1/check/dsvm-tempest-xen/7bfd07c/console.html
I see this in the n-cpu logs:
2015-11-17 23:16:50.731 DEBUG nova.compute.manager [req-320c876b-00a2-4785-a9a4-3979d567302e tempest-ImagesOneServerTestJSON-1384909178 tempest-ImagesOneServerTestJSON-1707465920] [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] Cleaning up image 085d91cf-d1a2-412c-bbcb-aca06ac5bd29 decorated_function /opt/stack/new/nova/nova/compute/manager.py:420
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] Traceback (most recent call last):
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] File "/opt/stack/new/nova/nova/compute/manager.py", line 416, in decorated_function
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] *args, **kwargs)
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] File "/opt/stack/new/nova/nova/compute/manager.py", line 3046, in snapshot_instance
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] task_states.IMAGE_SNAPSHOT)
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] File "/opt/stack/new/nova/nova/compute/manager.py", line 3076, in _snapshot_instance
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] update_task_state)
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] File "/opt/stack/new/nova/nova/virt/libvirt/driver.py", line 1405, in snapshot
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] guest.save_memory_state()
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] File "/opt/stack/new/nova/nova/virt/libvirt/guest.py", line 313, in save_memory_state
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] self._domain.managedSave(0)
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] File "/usr/local/lib/python2.7/dist-packages/eventlet/tpool.py", line 183, in doit
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] result = proxy_call(self._autowrap, f, *args, **kwargs)
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] File "/usr/local/lib/python2.7/dist-packages/eventlet/tpool.py", line 141, in proxy_call
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] rv = execute(f, *args, **kwargs)
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] File "/usr/local/lib/python2.7/dist-packages/eventlet/tpool.py", line 122, in execute
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] six.reraise(c, e, tb)
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] File "/usr/local/lib/python2.7/dist-packages/eventlet/tpool.py", line 80, in tworker
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] rv = meth(*args, **kwargs)
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] File "/usr/local/lib/python2.7/dist-packages/libvirt.py", line 1397, in managedSave
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] if ret == -1: raise libvirtError ('virDomainManagedSave() failed', dom=self)
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b] libvirtError: internal error: Failed to save domain '42' with libxenlight
2015-11-17 23:16:50.731 22089 ERROR nova.compute.manager [instance: fba53658-7192-4c2e-a7f7-8f7e186da13b]
2015-11-17 23:16:51.094 ERROR oslo_messaging.rpc.dispatcher [req-320c876b-00a2-4785-a9a4-3979d567302e tempest-ImagesOneServerTestJSON-1384909178 tempest-ImagesOneServerTestJSON-1707465920] Exception during message handling: internal error: Failed to save domain '42' with libxenlight
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher Traceback (most recent call last):
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 142, in _dispatch_and_reply
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher executor_callback))
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 186, in _dispatch
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher executor_callback)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 129, in _do_dispatch
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher result = func(ctxt, **new_args)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/exception.py", line 105, in wrapped
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher payload)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 197, in __exit__
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/exception.py", line 88, in wrapped
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher return f(self, context, *args, **kw)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 350, in decorated_function
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher LOG.warning(msg, e, instance=instance)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 197, in __exit__
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 323, in decorated_function
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher return function(self, context, *args, **kwargs)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 378, in decorated_function
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher kwargs['instance'], e, sys.exc_info())
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 197, in __exit__
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 366, in decorated_function
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher return function(self, context, *args, **kwargs)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 426, in decorated_function
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher instance=instance)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 197, in __exit__
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 416, in decorated_function
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher *args, **kwargs)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 3046, in snapshot_instance
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher task_states.IMAGE_SNAPSHOT)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 3076, in _snapshot_instance
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher update_task_state)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/libvirt/driver.py", line 1405, in snapshot
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher guest.save_memory_state()
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/libvirt/guest.py", line 313, in save_memory_state
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher self._domain.managedSave(0)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/eventlet/tpool.py", line 183, in doit
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher result = proxy_call(self._autowrap, f, *args, **kwargs)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/eventlet/tpool.py", line 141, in proxy_call
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher rv = execute(f, *args, **kwargs)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/eventlet/tpool.py", line 122, in execute
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher six.reraise(c, e, tb)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/eventlet/tpool.py", line 80, in tworker
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher rv = meth(*args, **kwargs)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/libvirt.py", line 1397, in managedSave
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher if ret == -1: raise libvirtError ('virDomainManagedSave() failed', dom=self)
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher libvirtError: internal error: Failed to save domain '42' with libxenlight
2015-11-17 23:16:51.094 22089 ERROR oslo_messaging.rpc.dispatcher
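To make the failure path above concrete: nova's `Guest.save_memory_state()` is a thin wrapper around the libvirt domain's `managedSave(0)`, so whatever error libxl raises surfaces in the compute manager unchanged. Here is a minimal stand-alone sketch of that call shape; `FakeDomain`, `FakeLibvirtError`, and the hard-coded error text are stand-ins for the real libvirt objects, not nova code:

```python
class FakeLibvirtError(Exception):
    """Stand-in for libvirt.libvirtError."""


class FakeDomain:
    """Stand-in for a libvirt domain whose libxl save fails, as in the log."""

    def managedSave(self, flags):
        raise FakeLibvirtError(
            "internal error: Failed to save domain '42' with libxenlight")


class Guest:
    """Mirrors the shape of nova.virt.libvirt.guest.Guest.save_memory_state()."""

    def __init__(self, domain):
        self._domain = domain

    def save_memory_state(self):
        # nova passes flags=0, i.e. no VIR_DOMAIN_SAVE_* modifiers
        self._domain.managedSave(0)


guest = Guest(FakeDomain())
try:
    guest.save_memory_state()
except FakeLibvirtError as e:
    print(e)  # the same message the compute manager logs
```

Nothing in nova catches the error between `managedSave` and the RPC dispatcher, which is why the same libvirtError appears in both tracebacks above.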
More about this bug: in the libvirt/libxl logs (starting at 2015-11-17 23:16:44 UTC), there is this:
xc: detail: xc_domain_save: starting save of domid 42
libxl: debug: libxl_dom.c:1074: libxl__domain_suspend_common_callback: issuing PV suspend request via XenBus control node
libxl: debug: libxl_dom.c:1078: libxl__domain_suspend_common_callback: wait for the guest to acknowledge suspend request
libxl: debug: libxl_dom.c:1125: libxl__domain_suspend_common_callback: guest acknowledged suspend request
libxl: debug: libxl_dom.c:1129: libxl__domain_suspend_common_callback: wait for the guest to suspend
libxl: error: libxl_dom.c:1151: libxl__domain_suspend_common_callback: guest did not suspend
xc: error: Suspend request failed: Internal error
xc: error: Domain appears not to have suspended: Internal error
xc: detail: Save exit of domid 42 with rc=1
libxl-save-helper: debug: complete r=1: Success
libxl: error: libxl_dom.c:1406: libxl__xc_domain_save_done: saving domain: domain responded to suspend request: Success
And I think it is Linux itself that fails to suspend. I have been trying to debug this for some time; so far the main things I've done are updating the CirrOS image to include a newer kernel and having the CI print the guest console when suspend fails. It appears that Linux fails to suspend if the suspend request arrives before the guest has fully booted.
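If the root cause really is a suspend request racing with guest boot, one possible mitigation would be to retry the managed save a few times before giving up, giving the guest time to finish booting and start responding to the PV suspend request. This is only a sketch, not something nova does today; `retry_managed_save` and its parameters are made up for illustration:

```python
import time


def retry_managed_save(domain, attempts=3, delay=5, libvirt_error=Exception):
    """Retry domain.managedSave(0), on the theory that a guest which has
    not finished booting may ignore the first PV suspend request.

    Hypothetical helper: returns the attempt number that succeeded, and
    re-raises the last error once all attempts are exhausted.
    """
    for attempt in range(1, attempts + 1):
        try:
            domain.managedSave(0)
            return attempt
        except libvirt_error:
            if attempt == attempts:
                raise
            # give the guest time to finish booting before retrying
            time.sleep(delay)
```

In real use `libvirt_error` would be `libvirt.libvirtError`; retrying blindly is crude, though, and the better fix may be on the guest side (only acknowledging suspend once the kernel is ready to act on it).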