Unserializable message: ('#ERROR', ValueError('I/O operation on closed file',))

Bug #1643457 reported by Thomas Bechtold
This bug affects 2 people
Affects: oslo.rootwrap
Status: In Progress
Importance: Medium
Assigned to: Rodolfo Alonso

Bug Description

When using oslo.rootwrap (version 5.1.1) in daemon mode together with Nova (Newton, version 14.0.3.dev19) and Xen (version 4.7.0) as hypervisor, the following error occurs:

2016-11-21 08:00:08.325 14283 DEBUG nova.virt.libvirt.driver [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] Ensure instance console log exists: /var/lib/nova/instances/5cae0baf-84cb-416a-a4d6-55896e582218/console.log _ensure_console_log_for_instance /usr/lib/python2.7/site-packages/nova/virt/libvirt/driver.py:2850
2016-11-21 08:00:08.328 14283 DEBUG nova.virt.libvirt.driver [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] Start _get_guest_xml network_info=[{"profile": {}, "ovs_interfaceid": "29cbb677-9d5a-4595-8342-78dce15dbf78", "preserve_on_delete": false, "network": {"bridge": "br-int", "subnets": [{"ips": [{"meta": {}, "version": 4, "type": "fixed", "floating_ips": [], "address": "192.168.123.8"}], "version": 4, "meta": {"dhcp_server": "192.168.123.2"}, "dns": [], "routes": [], "cidr": "192.168.123.0/24", "gateway": {"meta": {}, "version": 4, "type": "gateway", "address": "192.168.123.1"}}], "meta": {"injected": false, "tenant_id": "19808058eca14b13b2b1d47024e797df", "mtu": 8858}, "id": "af7dfcc3-2583-4bf1-8db8-ecc9082d2939", "label": "fixed"}, "devname": "tap29cbb677-9d", "vnic_type": "normal", "qbh_params": null, "meta": {}, "details": {"port_filter": true, "ovs_hybrid_plug": true}, "address": "fa:16:3e:b2:ed:f7", "active": false, "type": "ovs", "id": "29cbb677-9d5a-4595-8342-78dce15dbf78", "qbg_params": null}] disk_info={'disk_bus': 'xen', 'cdrom_bus': 'xen', 'mapping': {'disk': {'bus': 'xen', 'boot_index': '1', 'type': 'disk', 'dev': u'xvda'}, 'root': {'bus': 'xen', 'boot_index': '1', 'type': 'disk', 'dev': u'xvda'}}} image_meta=ImageMeta(checksum='48ee5e85802f289d169bccb0735f2a69',container_format='bare',created_at=2016-11-19T17:56:33Z,direct_url=<?>,disk_format='raw',id=2ed52473-66df-4328-9af6-2c3994c5f72a,min_disk=0,min_ram=0,name='manila-service-image',owner='535899fd82774c03b9b76c1627b04ae5',properties=ImageMetaProps,protected=<?>,size=2147483648,status='active',tags=<?>,updated_at=2016-11-19T17:56:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'swap': None, 'root_device_name': u'/dev/xvda', 'ephemerals': [], 'block_device_mapping': []} _get_guest_xml /usr/lib/python2.7/site-packages/nova/virt/libvirt/driver.py:4644
2016-11-21 08:00:08.329 14283 DEBUG nova.objects.instance [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] Lazy-loading 'pci_devices' on Instance uuid 5cae0baf-84cb-416a-a4d6-55896e582218 obj_load_attr /usr/lib/python2.7/site-packages/nova/objects/instance.py:1013
2016-11-21 08:00:08.331 14283 DEBUG oslo_messaging._drivers.amqpdriver [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] CALL msg_id: f829afe8f18745d1b0237d086687ef05 exchange 'nova' topic 'conductor' _send /usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py:448
2016-11-21 08:00:08.344 14283 DEBUG oslo_messaging._drivers.amqpdriver [-] received reply msg_id: f829afe8f18745d1b0237d086687ef05 __call__ /usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py:296
2016-11-21 08:00:08.356 14283 DEBUG nova.virt.libvirt.driver [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] Path '/var/lib/nova/instances' supports direct I/O _supports_direct_io /usr/lib/python2.7/site-packages/nova/virt/libvirt/driver.py:2778
2016-11-21 08:00:08.359 14283 DEBUG oslo_concurrency.lockutils [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] Lock "daemon-client-lock" acquired by "nova.utils._get_client" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2016-11-21 08:00:08.360 14283 DEBUG oslo_concurrency.lockutils [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] Lock "daemon-client-lock" released by "nova.utils._get_client" :: held 0.001s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2016-11-21 08:00:08.360 14283 INFO nova.utils [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] Executing RootwrapDaemonHelper.execute cmd=[u'xend status'] kwargs=[{'run_as_root': True}]
2016-11-21 08:00:08.361 14283 DEBUG nova.utils [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] Running cmd (subprocess): xend status execute /usr/lib/python2.7/site-packages/nova/utils.py:240
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] Instance failed to spawn
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] Traceback (most recent call last):
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 2078, in _build_resources
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] yield resources
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 1920, in _build_and_run_instance
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] block_device_info=block_device_info)
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] File "/usr/lib/python2.7/site-packages/nova/virt/libvirt/driver.py", line 2595, in spawn
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] block_device_info=block_device_info)
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] File "/usr/lib/python2.7/site-packages/nova/virt/libvirt/driver.py", line 4647, in _get_guest_xml
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] context)
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] File "/usr/lib/python2.7/site-packages/nova/virt/libvirt/driver.py", line 4472, in _get_guest_config
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] flavor, guest.os_type)
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] File "/usr/lib/python2.7/site-packages/nova/virt/libvirt/driver.py", line 3465, in _get_guest_storage_config
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] inst_type)
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] File "/usr/lib/python2.7/site-packages/nova/virt/libvirt/driver.py", line 3410, in _get_guest_disk_config
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] self._host.get_version())
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] File "/usr/lib/python2.7/site-packages/nova/virt/libvirt/imagebackend.py", line 142, in libvirt_info
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] self.is_block_dev)
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] File "/usr/lib/python2.7/site-packages/nova/virt/libvirt/utils.py", line 121, in pick_disk_driver_name
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] run_as_root=True, check_exit_code=True)
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] File "/usr/lib/python2.7/site-packages/nova/virt/libvirt/utils.py", line 47, in execute
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] return utils.execute(*args, **kwargs)
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] File "/usr/lib/python2.7/site-packages/nova/utils.py", line 293, in execute
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] *cmd, **kwargs)
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] File "/usr/lib/python2.7/site-packages/nova/utils.py", line 243, in execute
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] cmd, process_input)
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] File "/usr/lib/python2.7/site-packages/oslo_rootwrap/client.py", line 128, in execute
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] res = proxy.run_one_command(cmd, stdin)
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] File "<string>", line 2, in run_one_command
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] File "/usr/lib64/python2.7/multiprocessing/managers.py", line 774, in _callmethod
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] raise convert_to_error(kind, result)
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] RemoteError:
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] ---------------------------------------------------------------------------
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] Unserializable message: ('#ERROR', ValueError('I/O operation on closed file',))
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] ---------------------------------------------------------------------------
2016-11-21 08:00:08.365 14283 ERROR nova.compute.manager [instance: 5cae0baf-84cb-416a-a4d6-55896e582218]
2016-11-21 08:00:08.379 14283 INFO nova.compute.manager [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] Terminating instance
2016-11-21 08:00:08.382 14283 DEBUG oslo_messaging._drivers.amqpdriver [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] CAST unique_id: 87d6d51845f34c1da73508c59faa229c NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py:432
2016-11-21 08:00:08.385 14283 DEBUG nova.compute.manager [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python2.7/site-packages/nova/compute/manager.py:2192
2016-11-21 08:00:08.389 14283 INFO nova.virt.libvirt.driver [-] [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] During wait destroy, instance disappeared.
2016-11-21 08:00:08.390 14283 DEBUG oslo_concurrency.processutils [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] Running cmd (subprocess): mv /var/lib/nova/instances/5cae0baf-84cb-416a-a4d6-55896e582218 /var/lib/nova/instances/5cae0baf-84cb-416a-a4d6-55896e582218_del execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:344
2016-11-21 08:00:08.496 14283 DEBUG oslo_concurrency.processutils [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] CMD "mv /var/lib/nova/instances/5cae0baf-84cb-416a-a4d6-55896e582218 /var/lib/nova/instances/5cae0baf-84cb-416a-a4d6-55896e582218_del" returned: 0 in 0.105s execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:374
2016-11-21 08:00:08.502 14283 INFO nova.virt.libvirt.driver [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] Deleting instance files /var/lib/nova/instances/5cae0baf-84cb-416a-a4d6-55896e582218_del
2016-11-21 08:00:08.724 14283 INFO nova.virt.libvirt.driver [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] [instance: 5cae0baf-84cb-416a-a4d6-55896e582218] Deletion of /var/lib/nova/instances/5cae0baf-84cb-416a-a4d6-55896e582218_del complete
2016-11-21 08:00:08.733 14283 DEBUG oslo_messaging._drivers.amqpdriver [req-0e47710e-8533-460c-8300-6051170f030b 5d96b4795f5f45aaaab306ddf9af84bf 535899fd82774c03b9b76c1627b04ae5 - - -] CALL msg_id: fa6d13d0cf604ff2ba83f23ee5895218 exchange 'nova' topic 'conductor' _send /usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py:448
2016-11-21 08:00:08.898 14283 DEBUG oslo_messaging._drivers.amqpdriver [-] received reply msg_id: fa6d13d0cf604ff2ba83f23ee5895218 __call__ /usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py:296

Revision history for this message
Ihar Hrachyshka (ihar-hrachyshka) wrote :

I just hit a similar traceback in neutron, and the reason was that the executable being run was missing from the system. Maybe you have a similar problem, in which case it may not be a Nova issue.
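That theory is easy to check in isolation: spawning a nonexistent executable fails with OSError/ENOENT before any exit code exists, so inside the daemon the failure surfaces as an exception rather than a normal command result. A minimal sketch (the path is made up):

```python
import errno
import subprocess


def spawn_status(cmd):
    """Return 'ok' if the executable could be spawned, or the errno name
    if the spawn itself failed (e.g. ENOENT for a missing binary)."""
    try:
        proc = subprocess.Popen(
            cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        proc.communicate()
        return "ok"
    except OSError as exc:
        return errno.errorcode.get(exc.errno, str(exc.errno))


# A nonexistent path fails at spawn time, before any exit code exists:
print(spawn_status(["/no/such/xend"]))  # ENOENT
```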

Revision history for this message
Ben Nemec (bnemec) wrote :

If we can confirm that this was a missing-file problem, then this bug would mostly be about improving the error reporting so it is clearer what happened. Can somebody verify that this is the case? Thanks.
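One possible direction for that improved reporting, purely a sketch and not the eventual patch, is to catch the spawn failure on the daemon side and turn it into an ordinary, picklable (returncode, stdout, stderr) result. The function name mirrors the run_one_command seen in the traceback, but the body is hypothetical:

```python
import errno
import subprocess


def run_one_command(cmd, stdin=None):
    """Sketch of a daemon-side handler that converts a missing executable
    into a normal command result instead of letting the exception escape
    into the manager machinery."""
    try:
        proc = subprocess.Popen(
            cmd,
            stdin=subprocess.PIPE if stdin is not None else None,
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE,
        )
        out, err = proc.communicate(stdin)
        return proc.returncode, out, err
    except OSError as exc:
        if exc.errno == errno.ENOENT:
            # Mirror the shell convention: 127 means "command not found".
            return 127, b"", ("%s: command not found\n" % cmd[0]).encode()
        raise
```

With this shape the client would see a clear "command not found" message instead of the "Unserializable message" RemoteError.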

Changed in oslo.rootwrap:
status: New → Confirmed
importance: Undecided → Medium
Revision history for this message
Ben Nemec (bnemec) wrote :

Actually I guess Ihar's comment is sufficient confirmation that it is a problem. If this turns out not to have been the original bug then we can split this to track both, but until then we'll track it in this bug.

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to oslo.rootwrap (master)

Fix proposed to branch: master
Review: https://review.opendev.org/687554

Changed in oslo.rootwrap:
assignee: nobody → Rodolfo Alonso (rodolfo-alonso-hernandez)
status: Confirmed → In Progress
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Change abandoned on oslo.rootwrap (master)

Change abandoned by Rodolfo Alonso Hernandez (<email address hidden>) on branch: master
Review: https://review.opendev.org/687554
Reason: As commented by Herve and Thierry, I need to review rootwrap code before proposing this patch. Let's go for it.
