The patch also applies cleanly to stable/liberty. Logs with the patch applied:

2015-11-18 03:54:36.294 INFO ironic.conductor.manager [-] Executing cleaning on node 795a5722-22ad-4537-b17c-fcea5f7dc125, remaining steps: [{u'priority': 20, u'interface': u'deploy', u'step': u'test_devstack', u'abortable': True, u'reboot_requested': False}, {u'priority': 10, u'interface': u'deploy', u'step': u'erase_devices', u'abortable': True, u'reboot_requested': False}]
2015-11-18 03:54:36.300 INFO ironic.conductor.manager [-] Executing {u'priority': 20, u'interface': u'deploy', u'step': u'test_devstack', u'abortable': True, u'reboot_requested': False} on node 795a5722-22ad-4537-b17c-fcea5f7dc125
2015-11-18 03:54:36.304 DEBUG ironic.drivers.modules.agent_client [-] Executing agent command clean.execute_clean_step for node 795a5722-22ad-4537-b17c-fcea5f7dc125 from (pid=17795) _command /opt/stack/ironic/ironic/drivers/modules/agent_client.py:69
2015-11-18 03:54:36.447 DEBUG ironic.drivers.modules.agent_client [-] Agent command clean.execute_clean_step for node 795a5722-22ad-4537-b17c-fcea5f7dc125 returned result None, error None, HTTP status code 200 from (pid=17795) _command /opt/stack/ironic/ironic/drivers/modules/agent_client.py:93
2015-11-18 03:54:36.447 INFO ironic.conductor.manager [-] Clean step {u'priority': 20, u'interface': u'deploy', u'step': u'test_devstack', u'abortable': True, u'reboot_requested': False} on node 795a5722-22ad-4537-b17c-fcea5f7dc125 being executed asynchronously, waiting for driver.
2015-11-18 03:54:36.448 DEBUG ironic.common.states [-] Exiting old state 'cleaning' in response to event 'wait' from (pid=17795) on_exit /opt/stack/ironic/ironic/common/states.py:199
2015-11-18 03:54:36.448 DEBUG ironic.common.states [-] Entering new state 'clean wait' in response to event 'wait' from (pid=17795) on_enter /opt/stack/ironic/ironic/common/states.py:205
2015-11-18 03:54:36.460 DEBUG ironic.conductor.task_manager [-] Successfully released exclusive lock for node cleaning on node 795a5722-22ad-4537-b17c-fcea5f7dc125 (lock was held 0.18 sec) from (pid=17795) release_resources /opt/stack/ironic/ironic/conductor/task_manager.py:311
2015-11-18 03:54:36.501 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: d14e31a90b6d404da8ad5a066f0f1861 reply to reply_b48c2415999f4ac1845b4f36e08cb180 from (pid=17795) __call__ /usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:193
2015-11-18 03:54:36.502 DEBUG ironic.conductor.manager [req-51e72e4c-69a7-4225-b21e-d8db3e85b65a None None] RPC vendor_passthru called for node 795a5722-22ad-4537-b17c-fcea5f7dc125. from (pid=17795) vendor_passthru /opt/stack/ironic/ironic/conductor/manager.py:491
2015-11-18 03:54:36.503 DEBUG ironic.conductor.task_manager [req-51e72e4c-69a7-4225-b21e-d8db3e85b65a None None] Attempting to get exclusive lock on node 795a5722-22ad-4537-b17c-fcea5f7dc125 (for calling vendor passthru) from (pid=17795) __init__ /opt/stack/ironic/ironic/conductor/task_manager.py:201
2015-11-18 03:54:36.512 DEBUG ironic.conductor.task_manager [req-51e72e4c-69a7-4225-b21e-d8db3e85b65a None None] Node 795a5722-22ad-4537-b17c-fcea5f7dc125 successfully reserved for calling vendor passthru (took 0.01 seconds) from (pid=17795) reserve_node /opt/stack/ironic/ironic/conductor/task_manager.py:239
2015-11-18 03:54:36.516 DEBUG oslo_concurrency.lockutils [req-51e72e4c-69a7-4225-b21e-d8db3e85b65a None None] Lock "conductor_worker_spawn" acquired by "ironic.conductor.manager._spawn_worker" :: waited 0.000s from (pid=17795) inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:253
2015-11-18 03:54:36.516 DEBUG oslo_concurrency.lockutils [req-51e72e4c-69a7-4225-b21e-d8db3e85b65a None None] Lock "conductor_worker_spawn" released by "ironic.conductor.manager._spawn_worker" :: held 0.000s from (pid=17795) inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:265
2015-11-18 03:54:36.517 DEBUG ironic.drivers.modules.agent_base_vendor [-] Heartbeat from 795a5722-22ad-4537-b17c-fcea5f7dc125, last heartbeat at 1447818876. from (pid=17795) heartbeat /opt/stack/ironic/ironic/drivers/modules/agent_base_vendor.py:321
2015-11-18 03:54:36.524 DEBUG oslo_messaging._drivers.amqpdriver [req-51e72e4c-69a7-4225-b21e-d8db3e85b65a None None] sending reply msg_id: d14e31a90b6d404da8ad5a066f0f1861 reply queue: reply_b48c2415999f4ac1845b4f36e08cb180 from (pid=17795) _send_reply /usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:79
2015-11-18 03:54:36.531 DEBUG ironic.drivers.modules.agent_client [-] Fetching status of agent commands for node 795a5722-22ad-4537-b17c-fcea5f7dc125 from (pid=17795) get_commands_status /opt/stack/ironic/ironic/drivers/modules/agent_client.py:98
2015-11-18 03:54:36.644 DEBUG ironic.drivers.modules.agent_client [-] Status of agent commands for node 795a5722-22ad-4537-b17c-fcea5f7dc125: get_clean_steps: result "{u'clean_steps': {u'GenericHardwareManager': [{u'priority': 10, u'interface': u'deploy', u'step': u'erase_devices', u'abortable': True, u'reboot_requested': False}, {u'priority': 20, u'interface': u'deploy', u'step': u'test_devstack', u'abortable': True, u'reboot_requested': False}]}, u'hardware_manager_version': {u'generic_hardware_manager': u'1.0'}}", error "None"; execute_clean_step: result "{u'clean_step': {u'priority': 20, u'interface': u'deploy', u'step': u'test_devstack', u'abortable': True, u'reboot_requested': False}, u'clean_result': u'hi devstack!'}", error "None" from (pid=17795) get_commands_status /opt/stack/ironic/ironic/drivers/modules/agent_client.py:107
2015-11-18 03:54:36.645 DEBUG ironic.drivers.modules.agent_base_vendor [-] Cleaning command status for node 795a5722-22ad-4537-b17c-fcea5f7dc125 on step {u'priority': 20, u'interface': u'deploy', u'step': u'test_devstack', u'abortable': True, u'reboot_requested': False}: {u'command_error': None, u'command_status': u'SUCCEEDED', u'command_params': {u'node': {u'target_power_state': None, u'target_provision_state': u'available', u'last_error': None, u'updated_at': u'2015-11-18T03:54:36.000000', u'maintenance_reason': None, u'chassis_id': 1, u'provision_state': u'cleaning', u'clean_step': {u'priority': 20, u'interface': u'deploy', u'step': u'test_devstack', u'abortable': True, u'reboot_requested': False}, u'id': 2, u'uuid': u'795a5722-22ad-4537-b17c-fcea5f7dc125', u'console_enabled': False, u'extra': {}, u'raid_config': {}, u'provision_updated_at': u'2015-11-18T03:54:36.000000', u'maintenance': False, u'target_raid_config': {}, u'conductor_affinity': None, u'inspection_started_at': None, u'inspection_finished_at': None, u'power_state': u'power on', u'driver': u'agent_ssh', u'reservation': u'jim-devstack', u'properties': {u'memory_mb': 1024, u'cpu_arch': u'x86_64', u'local_gb': 10, u'cpus': 1}, u'instance_uuid': None, u'name': u'node-1', u'driver_info': {u'ssh_port': 22, u'ssh_username': u'stack', u'deploy_kernel': u'a5c4ce14-4d6e-4e7d-9235-c55ba845fd10', u'deploy_ramdisk': u'aa726045-c77c-436c-8a76-ec687daeceaa', u'ssh_virt_type': u'virsh', u'ssh_address': u'104.239.168.65', u'ssh_key_filename': u'/opt/stack/data/ironic/ssh_keys/ironic_key'}, u'created_at': u'2015-11-18T01:11:03.000000', u'driver_internal_info': {u'agent_url': u'http://10.1.0.7:9999', u'agent_erase_devices_iterations': 1, u'agent_last_heartbeat': 1447818876, u'clean_steps': [{u'priority': 20, u'interface': u'deploy', u'step': u'test_devstack', u'abortable': True, u'reboot_requested': False}, {u'priority': 10, u'interface': u'deploy', u'step': u'erase_devices', u'abortable': True, u'reboot_requested': False}], u'hardware_manager_version': {u'generic_hardware_manager': u'1.0'}}, u'instance_info': {u'deploy_key': u'8BE8XBQDSIL66CAEP6S47ATBJAZNYHSI'}}, u'step': {u'priority': 20, u'interface': u'deploy', u'step': u'test_devstack', u'abortable': True, u'reboot_requested': False}, u'ports': [{u'uuid': u'7949ab2d-301c-4af0-a7c5-8d0b66d265d1', u'extra': {u'vif_port_id': u'313ab317-f534-4bf2-b1d3-01ed9c29e048'}, u'created_at': u'2015-11-18T01:11:03.000000', u'updated_at': u'2015-11-18T03:49:43.000000', u'node_id': 2, u'address': u'52:54:00:52:7b:8a', u'id': 2}], u'clean_version': {u'generic_hardware_manager': u'1.0'}}, u'command_result': {u'clean_step': {u'priority': 20, u'interface': u'deploy', u'step': u'test_devstack', u'abortable': True, u'reboot_requested': False}, u'clean_result': u'hi devstack!'}, u'id': u'9dc0a4cf-3667-457d-8960-25f279c91a3e', u'command_name': u'execute_clean_step'} from (pid=17795) continue_cleaning /opt/stack/ironic/ironic/drivers/modules/agent_base_vendor.py:239
2015-11-18 03:54:36.645 INFO ironic.drivers.modules.agent_base_vendor [-] Agent on node 795a5722-22ad-4537-b17c-fcea5f7dc125 returned cleaning command success, moving to next clean step
2015-11-18 03:54:36.645 DEBUG ironic.drivers.modules.agent_base_vendor [-] Sending RPC to conductor to resume cleaning for node 795a5722-22ad-4537-b17c-fcea5f7dc125 from (pid=17795) _notify_conductor_resume_clean /opt/stack/ironic/ironic/drivers/modules/agent_base_vendor.py:216
2015-11-18 03:54:36.656 DEBUG ironic.conductor.task_manager [-] Successfully released exclusive lock for calling vendor passthru on node 795a5722-22ad-4537-b17c-fcea5f7dc125 (lock was held 0.14 sec) from (pid=17795) release_resources /opt/stack/ironic/ironic/conductor/task_manager.py:311
2015-11-18 03:54:36.657 DEBUG oslo_messaging._drivers.amqpdriver [-] CAST unique_id: cb6a8ee1bbfc4f3a87cc8576bf600e52 exchange 'ironic' topic 'ironic.conductor_manager.jim-devstack' from (pid=17795) _send /usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:448
2015-11-18 03:54:36.659 DEBUG oslo_messaging._drivers.amqpdriver [-] received message unique_id: cb6a8ee1bbfc4f3a87cc8576bf600e52 from (pid=17795) __call__ /usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:195
2015-11-18 03:54:36.659 DEBUG ironic.conductor.manager [req-51e72e4c-69a7-4225-b21e-d8db3e85b65a None None] RPC continue_node_clean called for node 795a5722-22ad-4537-b17c-fcea5f7dc125. from (pid=17795) continue_node_clean /opt/stack/ironic/ironic/conductor/manager.py:881
2015-11-18 03:54:36.660 DEBUG ironic.conductor.task_manager [req-51e72e4c-69a7-4225-b21e-d8db3e85b65a None None] Attempting to get exclusive lock on node 795a5722-22ad-4537-b17c-fcea5f7dc125 (for node cleaning) from (pid=17795) __init__ /opt/stack/ironic/ironic/conductor/task_manager.py:201
2015-11-18 03:54:36.667 DEBUG ironic.conductor.task_manager [req-51e72e4c-69a7-4225-b21e-d8db3e85b65a None None] Node 795a5722-22ad-4537-b17c-fcea5f7dc125 successfully reserved for node cleaning (took 0.01 seconds) from (pid=17795) reserve_node /opt/stack/ironic/ironic/conductor/task_manager.py:239
2015-11-18 03:54:36.671 DEBUG ironic.common.states [req-51e72e4c-69a7-4225-b21e-d8db3e85b65a None None] Exiting old state 'clean wait' in response to event 'resume' from (pid=17795) on_exit /opt/stack/ironic/ironic/common/states.py:199
2015-11-18 03:54:36.671 DEBUG ironic.common.states [req-51e72e4c-69a7-4225-b21e-d8db3e85b65a None None] Entering new state 'cleaning' in response to event 'resume' from (pid=17795) on_enter /opt/stack/ironic/ironic/common/states.py:205
2015-11-18 03:54:36.678 DEBUG oslo_concurrency.lockutils [req-51e72e4c-69a7-4225-b21e-d8db3e85b65a None None] Lock "conductor_worker_spawn" acquired by "ironic.conductor.manager._spawn_worker" :: waited 0.000s from (pid=17795) inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:253
2015-11-18 03:54:36.679 DEBUG oslo_concurrency.lockutils [req-51e72e4c-69a7-4225-b21e-d8db3e85b65a None None] Lock "conductor_worker_spawn" released by "ironic.conductor.manager._spawn_worker" :: held 0.000s from (pid=17795) inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:265
2015-11-18 03:54:36.679 INFO ironic.conductor.manager [-] Executing cleaning on node 795a5722-22ad-4537-b17c-fcea5f7dc125, remaining steps: [{u'priority': 10, u'interface': u'deploy', u'step': u'erase_devices', u'abortable': True, u'reboot_requested': False}]
2015-11-18 03:54:36.686 INFO ironic.conductor.manager [-] Executing {u'priority': 10, u'interface': u'deploy', u'step': u'erase_devices', u'abortable': True, u'reboot_requested': False} on node 795a5722-22ad-4537-b17c-fcea5f7dc125