TripleO Ironic - RuntimeError: Second simultaneous read on fileno N detected.

Bug #1308045 reported by Thom Leggett
This bug affects 3 people
Affects: Ironic
Status: Fix Released
Importance: High
Assigned to: Unassigned
Milestone: 2014.2

Bug Description

When building a TripleO devtest stack with USE_IRONIC=1 I often (but not always) see the undercloud fail to deploy a machine with the following error in ironic-conductor.log:

Traceback (most recent call last):
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/eventlet/hubs/hub.py", line 346, in fire_timers
    timer()
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/eventlet/hubs/timer.py", line 56, in __call__
    cb(*args, **kw)
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/eventlet/greenthread.py", line 194, in main
    result = function(*args, **kwargs)
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/ironic/conductor/manager.py", line 369, in _do_node_deploy
    node.target_provision_state = states.NOSTATE
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/ironic/openstack/common/excutils.py", line 70, in __exit__
    six.reraise(self.type_, self.value, self.tb)
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/ironic/conductor/manager.py", line 363, in _do_node_deploy
    task.driver.deploy.prepare(task, node)
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/ironic/drivers/modules/pxe.py", line 610, in prepare
    _cache_images(node, pxe_info, task.context)
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/ironic/drivers/modules/pxe.py", line 431, in _cache_images
    _cache_tftp_images(ctx, node, pxe_info)
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/ironic/drivers/modules/pxe.py", line 357, in _cache_tftp_images
    _get_image(ctx, path, uuid, CONF.pxe.tftp_master_path, None)
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/ironic/drivers/modules/pxe.py", line 337, in _get_image
    images.fetch_to_raw(ctx, uuid, tmp_path, image_service)
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/ironic/common/images.py", line 193, in fetch_to_raw
    image_to_raw(image_href, path, path_tmp)
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/ironic/common/images.py", line 198, in image_to_raw
    data = qemu_img_info(path_tmp)
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/ironic/common/images.py", line 167, in qemu_img_info
    'qemu-img', 'info', path)
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/ironic/common/utils.py", line 64, in execute
    result = processutils.execute(*cmd, **kwargs)
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/ironic/openstack/common/processutils.py", line 163, in execute
    result = obj.communicate()
  File "/usr/lib/python2.7/subprocess.py", line 799, in communicate
    return self._communicate(input)
  File "/usr/lib/python2.7/subprocess.py", line 1403, in _communicate
    stdout, stderr = self._communicate_with_select(input)
  File "/usr/lib/python2.7/subprocess.py", line 1504, in _communicate_with_select
    rlist, wlist, xlist = select.select(read_set, write_set, [])
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/eventlet/green/select.py", line 75, in select
    listeners.append(hub.add(hub.READ, k, on_read))
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/eventlet/hubs/epolls.py", line 48, in add
    listener = BaseHub.add(self, evtype, fileno, cb)
  File "/opt/stack/venvs/ironic/local/lib/python2.7/site-packages/eventlet/hubs/hub.py", line 126, in add
    evtype, fileno, evtype))
RuntimeError: Second simultaneous read on fileno 9 detected. Unless you really know what you're doing, make sure that only one greenthread can read any particular socket. Consider using a pools.Pool. If you do know what you're doing and want to disable this error, call eventlet.debug.hub_prevent_multiple_readers(False)
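
For context, the RuntimeError is raised by eventlet's hub when a second greenthread tries to register a READ listener on a file descriptor that another greenthread is already waiting on. A minimal, hypothetical sketch (not from the bug report) that trips the same check looks like this:

# Illustrative only: two greenthreads selecting on the same pipe fd through
# eventlet's green select, mirroring the concurrent subprocess pipe reads in
# the traceback above.
import os

import eventlet
from eventlet.green import select

read_fd, write_fd = os.pipe()  # nothing ever writes, so readers just block


def wait_for_data():
    # Registers a READ listener for read_fd with the hub and blocks.
    select.select([read_fd], [], [])


eventlet.spawn(wait_for_data)  # first greenthread registers the listener
eventlet.sleep(0)              # yield so it runs and starts waiting
wait_for_data()                # second listener on the same fileno raises
                               # "Second simultaneous read on fileno N detected"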

Revision history for this message
Dmitry Tantsur (divius) wrote:

Hi! I believe this error should be fixed by the patch at https://review.openstack.org/#/c/95213/
If that's not the case, please feel free to reopen this bug and provide an up-to-date traceback.
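
For anyone hitting this before picking up that patch, the general idea is to keep two greenthreads from reading the same subprocess pipes at once. A rough, hypothetical sketch of that pattern, not the contents of the linked review, is to serialize the external qemu-img calls with an eventlet semaphore:

# Hypothetical illustration of serializing greenthreaded subprocess calls;
# the real fix is in the review linked above.
from eventlet import semaphore

# Shared by every greenthread that shells out to qemu-img.
_qemu_img_lock = semaphore.Semaphore(1)


def qemu_img_info_serialized(execute, path):
    # "execute" stands in for a processutils.execute-style callable; holding
    # the semaphore means only one greenthread at a time owns the child
    # process's stdout/stderr pipes.
    with _qemu_img_lock:
        out, err = execute('qemu-img', 'info', path)
    return out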

Changed in ironic:
status: New → Fix Committed
importance: Undecided → High
Thierry Carrez (ttx)
Changed in ironic:
milestone: none → juno-1
status: Fix Committed → Fix Released
Revision history for this message
Cian O'Driscoll (dricco) wrote:

I'm seeing this during an upgrade of the overcloud.

+--------------------------------------+-------------------------------------+---------+------------------+-------------+---------------------+
| ID                                   | Name                                | Status  | Task State       | Power State | Networks            |
+--------------------------------------+-------------------------------------+---------+------------------+-------------+---------------------+
| ed315f19-d1ec-4526-9ea1-cea2607e876c | overcloud-NovaCompute0-nzkmif6luqlq | ERROR   | -                | Running     | ctlplane=192.0.2.25 |
| b33273ed-b0ea-4c64-949e-552834f7fe2d | overcloud-NovaCompute1-mv4pmsoa6uuq | REBUILD | rebuild_spawning | Running     | ctlplane=192.0.2.24 |
| 7ce52498-2c4e-4762-b461-e92d4a84d2a6 | overcloud-controller0-qzine3rvwbck  | REBUILD | rebuild_spawning | Running     | ctlplane=192.0.2.26 |
+--------------------------------------+-------------------------------------+---------+------------------+-------------+---------------------+

| fault | {"message": "Second simultaneous read on fileno 14 detected. Unless you really know what you're doing, make sure that only one greenthread can read any particular socket. Consider using a pools.Pool. If you do know what you're doing and want to disable this error, c", "code": 500, "details": " File \"/opt/stack/venvs/nova/local/lib/python2.7/site-packages/nova/compute/manager.py\", line 307, in decorated_function |
| | return function(self, context, *args, **kwargs)

/openstack/common/processutils.py", line 186, in execute
2014-07-23 16:09:00.915 5551 TRACE oslo.messaging.rpc.dispatcher result = obj.communicate()
2014-07-23 16:09:00.915 5551 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/subprocess.py", line 799, in communicate
2014-07-23 16:09:00.915 5551 TRACE oslo.messaging.rpc.dispatcher return self._communicate(input)
2014-07-23 16:09:00.915 5551 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/subprocess.py", line 1403, in _communicate
2014-07-23 16:09:00.915 5551 TRACE oslo.messaging.rpc.dispatcher stdout, stderr = self._communicate_with_select(input)
2014-07-23 16:09:00.915 5551 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/subprocess.py", line 1504, in _communicate_with_select
2014-07-23 16:09:00.915 5551 TRACE oslo.messaging.rpc.dispatcher rlist, wlist, xlist = select.select(read_set, write_set, [])
2014-07-23 16:09:00.915 5551 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/eventlet/green/select.py", line 75, in select
2014-07-23 16:09:00.915 5551 TRACE oslo.messaging.rpc.dispatcher listeners.append(hub.add(hub.READ, k, on_read))
2014-07-23 16:09:00.915 5551 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/eventlet/hubs/epolls.py", line 48, in add
2014-07-23 16:09:00.915 5551 TRACE oslo.messaging.rpc.dispatcher listener = BaseHub.add(self, evtype, fileno, cb)
2014-07-23 16:09:00.915 5551 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/eventl...

Revision history for this message
Cian O'Driscoll (dricco) wrote:

We need the eventlet fix for this; see https://bugs.launchpad.net/ironic/+bug/1308045

Revision history for this message
Cian O'Driscoll (dricco) wrote:

Wrong link above; this is the correct one: https://bugs.launchpad.net/ironic/+bug/1321787
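
As a general illustration, and purely as an assumption about what an eventlet-side fix enables rather than a description of it, running the command through eventlet's green subprocess keeps pipe I/O cooperative instead of funnelling stdlib subprocess pipes through green select:

# Illustrative sketch only, assuming eventlet.green.subprocess is available
# in the deployed eventlet; it avoids the stdlib subprocess + green select
# combination shown in the tracebacks above.
from eventlet.green import subprocess


def run_qemu_img_info(path):
    # communicate() cooperates with the eventlet hub instead of registering
    # raw READ listeners via eventlet.green.select.
    proc = subprocess.Popen(['qemu-img', 'info', path],
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    out, err = proc.communicate()
    if proc.returncode != 0:
        raise RuntimeError('qemu-img info failed: %s' % err)
    return out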

Thierry Carrez (ttx)
Changed in ironic:
milestone: juno-1 → 2014.2