Activity log for bug #1745529

Date Who What changed Old value New value Message
2018-01-26 02:29:30 Brin Zhang bug added bug
2018-01-26 02:30:52 Brin Zhang description

You can use the following steps to reproduce (this problem also exists in the Ocata version):

1. nova pause 5e824732-c557-434f-9902-a6d9abce7bda
2. nova list

+--------------------------------------+-------+--------+------------+-------------+---------------------+
| ID                                   | Name  | Status | Task State | Power State | Networks            |
+--------------------------------------+-------+--------+------------+-------------+---------------------+
| 5e824732-c557-434f-9902-a6d9abce7bda | test  | PAUSED | -          | Paused      | network=192.168.0.9 |
+--------------------------------------+-------+--------+------------+-------------+---------------------+

3. nova shelve 5e824732-c557-434f-9902-a6d9abce7bda

The error log is as follows:

-- Logs begin at Tue 2018-01-23 17:20:38 EST. --
Jan 26 10:12:46 localhost.localdomain nova-compute[8304]: DEBUG nova.compute.manager [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Triggering sync for uuid 5e824732-c557-434f-9902-a6d9abce7bda {{(pid=8304) _sync_power_states /opt/stack/nova/nova/compute/manager.py:7028}}
Jan 26 10:12:46 localhost.localdomain nova-compute[8304]: DEBUG nova.compute.manager [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Triggering sync for uuid a9bfe5ee-58f7-4e35-a59b-05a06857438d {{(pid=8304) _sync_power_states /opt/stack/nova/nova/compute/manager.py:7028}}
Jan 26 10:12:46 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.lockutils [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Lock "5e824732-c557-434f-9902-a6d9abce7bda" acquired by "nova.compute.manager.query_driver_power_state_and_sync" :: waited 0.000s {{(pid=8304) inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:273}}
Jan 26 10:12:46 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.lockutils [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Lock "a9bfe5ee-58f7-4e35-a59b-05a06857438d" acquired by 
"nova.compute.manager.query_driver_power_state_and_sync" :: waited 0.000s {{(pid=8304) inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:273}} Jan 26 10:12:46 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.lockutils [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Lock "5e824732-c557-434f-9902-a6d9abce7bda" released by "nova.compute.manager.query_driver_power_state_and_sync" :: held 0.157s {{(pid=8304) inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:285}} Jan 26 10:12:46 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.lockutils [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Lock "a9bfe5ee-58f7-4e35-a59b-05a06857438d" released by "nova.compute.manager.query_driver_power_state_and_sync" :: held 0.165s {{(pid=8304) inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:285}} Jan 26 10:12:50 localhost.localdomain nova-compute[8304]: DEBUG oslo_service.periodic_task [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=8304) run_periodic_tasks /usr/lib/python2.7/site-packages/oslo_service/periodic_task.py:215}} Jan 26 10:12:52 localhost.localdomain nova-compute[8304]: DEBUG oslo_service.periodic_task [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=8304) run_periodic_tasks /usr/lib/python2.7/site-packages/oslo_service/periodic_task.py:215}} Jan 26 10:12:52 localhost.localdomain nova-compute[8304]: DEBUG nova.compute.manager [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=8304) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:7220}} Jan 26 10:12:55 localhost.localdomain nova-compute[8304]: DEBUG oslo_service.periodic_task [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Running periodic task ComputeManager._poll_volume_usage {{(pid=8304) run_periodic_tasks /usr/lib/python2.7/site-packages/oslo_service/periodic_task.py:215}} Jan 26 10:13:03 localhost.localdomain nova-compute[8304]: DEBUG oslo_service.periodic_task [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Running periodic task ComputeManager._poll_rescued_instances {{(pid=8304) run_periodic_tasks /usr/lib/python2.7/site-packages/oslo_service/periodic_task.py:215}} Jan 26 10:13:06 localhost.localdomain nova-compute[8304]: DEBUG oslo_service.periodic_task [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Running periodic task ComputeManager.update_available_resource {{(pid=8304) run_periodic_tasks /usr/lib/python2.7/site-packages/oslo_service/periodic_task.py:215}} Jan 26 10:13:06 localhost.localdomain nova-compute[8304]: DEBUG nova.compute.resource_tracker [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Auditing locally available compute resources for localhost.localdomain (node: localhost.localdomain) {{(pid=8304) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:663}} Jan 26 10:13:07 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.processutils [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Running cmd (subprocess): /usr/bin/python -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e824732-c557-434f-9902-a6d9abce7bda/disk {{(pid=8304) execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:372}} Jan 26 10:13:07 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.processutils [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] CMD "/usr/bin/python -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e824732-c557-434f-9902-a6d9abce7bda/disk" returned: 0 in 0.307s {{(pid=8304) execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:409}} Jan 26 10:13:07 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.processutils [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Running cmd (subprocess): /usr/bin/python -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e824732-c557-434f-9902-a6d9abce7bda/disk {{(pid=8304) execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:372}} Jan 26 10:13:07 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.processutils [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] CMD "/usr/bin/python -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e824732-c557-434f-9902-a6d9abce7bda/disk" returned: 0 in 0.257s {{(pid=8304) execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:409}} Jan 26 10:13:07 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.processutils [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Running cmd (subprocess): /usr/bin/python -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a9bfe5ee-58f7-4e35-a59b-05a06857438d/disk {{(pid=8304) execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:372}} Jan 26 10:13:07 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.processutils [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] CMD "/usr/bin/python -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a9bfe5ee-58f7-4e35-a59b-05a06857438d/disk" returned: 0 in 0.269s 
{{(pid=8304) execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:409}} Jan 26 10:13:07 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.processutils [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Running cmd (subprocess): /usr/bin/python -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a9bfe5ee-58f7-4e35-a59b-05a06857438d/disk {{(pid=8304) execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:372}} Jan 26 10:13:08 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.processutils [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] CMD "/usr/bin/python -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a9bfe5ee-58f7-4e35-a59b-05a06857438d/disk" returned: 0 in 0.271s {{(pid=8304) execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:409}} Jan 26 10:13:08 localhost.localdomain nova-compute[8304]: DEBUG nova.compute.resource_tracker [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Hypervisor/Node resource view: name=localhost.localdomain free_ram=2158MB free_disk=29GB free_vcpus=-1 pci_devices=[{"dev_id": "pci_0000_00_00_0", "product_id": "1237", "dev_type": "type-PCI", "numa_node": null, "vendor_id": "8086", "label": "label_8086_1237", "address": "0000:00:00.0"}, {"dev_id": "pci_0000_00_01_0", "product_id": "7000", "dev_type": "type-PCI", "numa_node": null, "vendor_id": "8086", "label": "label_8086_7000", "address": "0000:00:01.0"}, {"dev_id": "pci_0000_00_01_1", "product_id": "7010", "dev_type": "type-PCI", "numa_node": null, "vendor_id": "8086", "label": "label_8086_7010", "address": "0000:00:01.1"}, {"dev_id": "pci_0000_00_01_3", "product_id": "7113", "dev_type": "type-PCI", "numa_node": null, "vendor_id": "8086", "label": "label_8086_7113", "address": "0000:00:01.3"}, {"dev_id": "pci_0000_00_02_0", 
"product_id": "00b8", "dev_type": "type-PCI", "numa_node": null, "vendor_id": "1013", "label": "label_1013_00b8", "address": "0000:00:02.0"}, {"dev_id": "pci_0000_00_04_0", "product_id": "1003", "dev_type": "type-PCI", "numa_node": null, "vendor_id": "1af4", "label": "label_1af4_1003", "address": "0000:00:04.0"}, {"dev_id": "pci_0000_00_05_0", "product_id": "1004", "dev_type": "type-PCI", "numa_node": null, "vendor_id": "1af4", "label": "label_1af4_1004", "address": "0000:00:05.0"}, {"dev_id": "pci_0000_00_06_0", "product_id": "0001", "dev_type": "type-PCI", "numa_node": null, "vendor_id": "1b36", "label": "label_1b36_0001", "address": "0000:00:06.0"}, {"dev_id": "pci_0000_01_01_0", "product_id": "1001", "dev_type": "type-PCI", "numa_node": null, "vendor_id": "1af4", "label": "label_1af4_1001", "address": "0000:01:01.0"}, {"dev_id": "pci_0000_00_07_0", "product_id": "1002", "dev_type": "type-PCI", "numa_node": null, "vendor_id": "1af4", "label": "label_1af4_1002", "address": "0000:00:07.0"}, {"dev_id": "pci_0000_00_08_0", "product_id": "293a", "dev_ Jan 26 10:13:08 localhost.localdomain nova-compute[8304]: type": "type-PCI", "numa_node": null, "vendor_id": "8086", "label": "label_8086_293a", "address": "0000:00:08.0"}, {"dev_id": "pci_0000_00_09_0", "product_id": "003f", "dev_type": "type-PCI", "numa_node": null, "vendor_id": "106b", "label": "label_106b_003f", "address": "0000:00:09.0"}, {"dev_id": "pci_0000_00_10_0", "product_id": "1000", "dev_type": "type-PCI", "numa_node": null, "vendor_id": "1af4", "label": "label_1af4_1000", "address": "0000:00:10.0"}] {{(pid=8304) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:802}} Jan 26 10:13:08 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.lockutils [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Lock "compute_resources" acquired by "nova.compute.resource_tracker._update_available_resource" :: waited 0.000s {{(pid=8304) inner 
/usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:273}} Jan 26 10:13:08 localhost.localdomain nova-compute[8304]: DEBUG nova.compute.resource_tracker [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Compute driver doesn't require allocation refresh and we're on a compute host in a deployment that only has compute hosts with Nova versions >=16 (Pike). Skipping auto-correction of allocations. {{(pid=8304) _update_usage_from_instances /opt/stack/nova/nova/compute/resource_tracker.py:1205}} Jan 26 10:13:08 localhost.localdomain nova-compute[8304]: DEBUG nova.compute.resource_tracker [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Instance 5e824732-c557-434f-9902-a6d9abce7bda actively managed on this compute host and has allocations in placement: {u'resources': {u'VCPU': 1, u'MEMORY_MB': 512, u'DISK_GB': 1}}. {{(pid=8304) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1227}} Jan 26 10:13:08 localhost.localdomain nova-compute[8304]: DEBUG nova.compute.resource_tracker [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Instance a9bfe5ee-58f7-4e35-a59b-05a06857438d actively managed on this compute host and has allocations in placement: {u'resources': {u'VCPU': 1, u'MEMORY_MB': 512, u'DISK_GB': 1}}. 
{{(pid=8304) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1227}} Jan 26 10:13:08 localhost.localdomain nova-compute[8304]: DEBUG nova.compute.resource_tracker [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Total usable vcpus: 1, total allocated vcpus: 2 {{(pid=8304) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:818}} Jan 26 10:13:08 localhost.localdomain nova-compute[8304]: INFO nova.compute.resource_tracker [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Final resource view: name=localhost.localdomain phys_ram=8191MB used_ram=1536MB phys_disk=35GB used_disk=2GB total_vcpus=1 used_vcpus=2 pci_stats=[] Jan 26 10:13:08 localhost.localdomain nova-compute[8304]: DEBUG nova.compute.resource_tracker [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Compute_service record updated for localhost.localdomain:localhost.localdomain {{(pid=8304) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:758}} Jan 26 10:13:08 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.lockutils [None req-6e5c11f8-cb5c-4e0d-82f3-4e72d18c8ee3 admin admin] Lock "compute_resources" released by "nova.compute.resource_tracker._update_available_resource" :: held 0.598s {{(pid=8304) inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:285}} Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.lockutils [None req-54849f29-2fd5-4ece-976b-bf4b2c5fd169 admin admin] Lock "5e824732-c557-434f-9902-a6d9abce7bda" acquired by "nova.compute.manager.do_shelve_instance" :: waited 0.000s {{(pid=8304) inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:273}} Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: INFO nova.compute.manager [None req-54849f29-2fd5-4ece-976b-bf4b2c5fd169 admin admin] [instance: 5e824732-c557-434f-9902-a6d9abce7bda] Shelving Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: 
DEBUG nova.objects.instance [None req-54849f29-2fd5-4ece-976b-bf4b2c5fd169 admin admin] Lazy-loading 'flavor' on Instance uuid 5e824732-c557-434f-9902-a6d9abce7bda {{(pid=8304) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1056}} Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: DEBUG nova.notifications.objects.base [None req-54849f29-2fd5-4ece-976b-bf4b2c5fd169 admin admin] Defaulting the value of the field 'projects' to None in FlavorPayload due to 'Cannot call _load_projects on orphaned Flavor object' {{(pid=8304) populate_schema /opt/stack/nova/nova/notifications/objects/base.py:130}} Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: DEBUG nova.virt.libvirt.driver [None req-54849f29-2fd5-4ece-976b-bf4b2c5fd169 admin admin] [instance: 5e824732-c557-434f-9902-a6d9abce7bda] Shutting down instance from state 3 {{(pid=8304) _clean_shutdown /opt/stack/nova/nova/virt/libvirt/driver.py:2638}} Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.lockutils [None req-54849f29-2fd5-4ece-976b-bf4b2c5fd169 admin admin] Lock "5e824732-c557-434f-9902-a6d9abce7bda" released by "nova.compute.manager.do_shelve_instance" :: held 0.199s {{(pid=8304) inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:285}} Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.lockutils [None req-54849f29-2fd5-4ece-976b-bf4b2c5fd169 admin admin] Lock "compute_resources" acquired by "nova.compute.resource_tracker.update_usage" :: waited 0.000s {{(pid=8304) inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:273}} Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: DEBUG oslo_concurrency.lockutils [None req-54849f29-2fd5-4ece-976b-bf4b2c5fd169 admin admin] Lock "compute_resources" released by "nova.compute.resource_tracker.update_usage" :: held 0.038s {{(pid=8304) inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:285}} Jan 26 10:13:10 localhost.localdomain 
nova-compute[8304]: INFO nova.compute.manager [None req-54849f29-2fd5-4ece-976b-bf4b2c5fd169 admin admin] [instance: 5e824732-c557-434f-9902-a6d9abce7bda] Successfully reverted task state from shelving on failure for instance. Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server [None req-54849f29-2fd5-4ece-976b-bf4b2c5fd169 admin admin] Exception during message handling: libvirtError: Requested operation is not valid: domain is not running Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 220, in dispatch Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 190, in _do_dispatch Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 76, in wrapped Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server function_name, call_dict, binary) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File 
"/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__ Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server self.force_reraise() Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 67, in wrapped Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 186, in decorated_function Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server "Error: %s", e, instance=instance) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__ Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server self.force_reraise() Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 156, in decorated_function Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server return function(self, context, *args, 
**kwargs) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 944, in decorated_function Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 214, in decorated_function Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server kwargs['instance'], e, sys.exc_info()) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__ Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server self.force_reraise() Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 202, in decorated_function Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 4678, in shelve_instance Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server do_shelve_instance() Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py", line 274, in inner Jan 26 
10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 4677, in do_shelve_instance Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server self._shelve_instance(context, instance, image_id, clean_shutdown) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 4711, in _shelve_instance Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server self._power_off_instance(context, instance, clean_shutdown) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2318, in _power_off_instance Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server self.driver.power_off(instance, timeout, retry_interval) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 2684, in power_off Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server self._clean_shutdown(instance, timeout, retry_interval) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 2639, in _clean_shutdown Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server guest.shutdown() Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/guest.py", line 589, in shutdown Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server self._domain.shutdown() Jan 26 10:13:10 localhost.localdomain 
nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/eventlet/tpool.py", line 186, in doit Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server result = proxy_call(self._autowrap, f, *args, **kwargs) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/eventlet/tpool.py", line 144, in proxy_call Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server rv = execute(f, *args, **kwargs) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/eventlet/tpool.py", line 125, in execute Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server six.reraise(c, e, tb) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/eventlet/tpool.py", line 83, in tworker Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server rv = meth(*args, **kwargs) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib64/python2.7/site-packages/libvirt.py", line 2399, in shutdown Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server if ret == -1: raise libvirtError ('virDomainShutdown() failed', dom=self) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server libvirtError: Requested operation is not valid: domain is not running Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server Jan 26 10:13:16 localhost.localdomain nova-compute[8304]: DEBUG oslo_service.periodic_task [None req-54849f29-2fd5-4ece-976b-bf4b2c5fd169 admin admin] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=8304) run_periodic_tasks 
/usr/lib/python2.7/site-packages/oslo_service/periodic_task.py:215}}
**kwargs) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/eventlet/tpool.py", line 144, in proxy_call Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server rv = execute(f, *args, **kwargs) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/eventlet/tpool.py", line 125, in execute Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server six.reraise(c, e, tb) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/eventlet/tpool.py", line 83, in tworker Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server rv = meth(*args, **kwargs) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server File "/usr/lib64/python2.7/site-packages/libvirt.py", line 2399, in shutdown Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server if ret == -1: raise libvirtError ('virDomainShutdown() failed', dom=self) Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server libvirtError: Requested operation is not valid: domain is not running Jan 26 10:13:10 localhost.localdomain nova-compute[8304]: ERROR oslo_messaging.rpc.server Jan 26 10:13:16 localhost.localdomain nova-compute[8304]: DEBUG oslo_service.periodic_task [None req-54849f29-2fd5-4ece-976b-bf4b2c5fd169 admin admin] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=8304) run_periodic_tasks /usr/lib/python2.7/site-packages/oslo_service/periodic_task.py:215}}
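The traceback shows the failure mode: shelving a PAUSED instance walks shelve_instance -> _power_off_instance -> driver.power_off -> guest.shutdown(), and libvirt rejects virDomainShutdown() because the domain is not in the running state. The sketch below is a minimal illustration of that behavior and of the kind of state guard a driver can apply before attempting a clean shutdown. It is not Nova's actual fix; the Domain class is a hypothetical stand-in for libvirt.virDomain, modeling only the behavior relevant to this bug.

```python
class LibvirtError(Exception):
    """Stand-in for libvirt.libvirtError (assumption, not the real class)."""


class Domain:
    """Hypothetical stand-in for libvirt.virDomain."""

    def __init__(self, running):
        self.running = running
        self.destroyed = False

    def shutdown(self):
        # Real libvirt raises "Requested operation is not valid:
        # domain is not running" when the guest is paused or shut off.
        if not self.running:
            raise LibvirtError(
                "Requested operation is not valid: domain is not running")
        self.running = False

    def destroy(self):
        # A hard power-off is valid regardless of the guest's run state.
        self.running = False
        self.destroyed = True


def power_off(domain):
    """Guarded power-off: only attempt a clean shutdown when the domain
    is actually running; otherwise fall back to a hard destroy()."""
    if domain.running:
        domain.shutdown()
    else:
        domain.destroy()


# A paused domain no longer blows up with libvirtError:
paused = Domain(running=False)
power_off(paused)
print(paused.destroyed)  # → True
```

The essential point is the state check: an unguarded `domain.shutdown()` on a paused guest reproduces the exception in this bug, while checking the run state first lets the shelve path degrade to a hard power-off instead of erroring out.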
2018-01-26 02:31:15 Brin Zhang nova: assignee Brin Zhang (zhangbailin)
2018-01-26 02:37:16 zhiyuan su nova: status New Confirmed
2018-01-26 02:39:00 zhiyuan su tags shelve
2018-01-26 02:40:52 zhiyuan su tags shelve ocata
2018-01-26 02:43:31 Leopard Ma nova: assignee Brin Zhang (zhangbailin) Leopard Ma (leopardma)
2018-02-12 01:52:38 OpenStack Infra nova: status Confirmed In Progress
2018-03-27 14:14:46 Matt Riedemann tags ocata libvirt ocata shelve
2018-03-27 14:22:49 Matt Riedemann nova: importance Undecided Medium
2018-03-27 14:23:04 Matt Riedemann summary The pause status of the virtual machine to perform the shelve operation reported error Failed to shelve paused instance with libvirt
2018-03-27 14:23:25 Matt Riedemann summary Failed to shelve paused instance with libvirt Failed to shelve paused instance with libvirt: "libvirtError: Requested operation is not valid: domain is not running"
2018-03-27 14:23:33 Matt Riedemann nominated for series nova/queens
2018-03-27 14:23:33 Matt Riedemann bug task added nova/queens
2018-03-27 14:23:33 Matt Riedemann nominated for series nova/pike
2018-03-27 14:23:33 Matt Riedemann bug task added nova/pike
2018-03-27 14:23:41 Matt Riedemann nova/pike: status New Confirmed
2018-03-27 14:23:43 Matt Riedemann nova/queens: status New Confirmed
2018-03-27 14:23:45 Matt Riedemann nova/pike: importance Undecided Medium
2018-03-27 14:23:47 Matt Riedemann nova/queens: importance Undecided Medium
2018-05-01 16:28:56 OpenStack Infra nova: assignee Leopard Ma (leopardma) Matt Riedemann (mriedem)
2018-05-01 16:29:41 Matt Riedemann nova: assignee Matt Riedemann (mriedem) Leopard Ma (leopardma)
2018-05-01 16:29:57 Matt Riedemann nominated for series nova/ocata
2018-05-01 16:29:57 Matt Riedemann bug task added nova/ocata
2018-05-01 16:30:18 Matt Riedemann nova/ocata: status New Confirmed
2018-05-01 16:30:23 Matt Riedemann nova/ocata: importance Undecided Medium
2018-05-11 01:51:48 OpenStack Infra nova: status In Progress Fix Released
2018-05-20 17:24:13 OpenStack Infra nova/queens: status Confirmed In Progress
2018-05-20 17:24:13 OpenStack Infra nova/queens: assignee Ghanshyam Mann (ghanshyammann)
2018-05-20 17:45:23 OpenStack Infra nova/pike: status Confirmed In Progress
2018-05-20 17:45:23 OpenStack Infra nova/pike: assignee Ghanshyam Mann (ghanshyammann)
2018-05-20 17:53:22 OpenStack Infra nova/ocata: status Confirmed In Progress
2018-05-20 17:53:22 OpenStack Infra nova/ocata: assignee Ghanshyam Mann (ghanshyammann)
2018-05-29 17:37:35 OpenStack Infra nova/queens: status In Progress Fix Committed
2018-05-31 04:34:44 OpenStack Infra nova/pike: status In Progress Fix Committed
2018-05-31 15:26:55 OpenStack Infra nova/ocata: status In Progress Fix Committed