Hello, I am still encountering this issue:

The action raised an exception [action_ex_id=bc158ba8-ea5e-42d0-9712-30d24698a364, action_cls='', attributes='{}', params='{u'container_config': u'overcloud-config', u'container': u'overcloud'}'] u'23f1abe0-aa3f-46c3-a712-d3216b9b297d'

[stack@undercloud (stackrc) ~]$ mistral execution-get dc662a9e-91e6-45a2-a033-0300e7bdc386
+--------------------+---------------------------------------+
| Field              | Value                                 |
+--------------------+---------------------------------------+
| ID                 | dc662a9e-91e6-45a2-a033-0300e7bdc386  |
| Workflow ID        | f3401518-86d3-4c8a-aad7-0507362bcf97  |
| Workflow name      | tripleo.messaging.v1.send             |
| Workflow namespace |                                       |
| Description        | sub-workflow execution                |
| Task Execution ID  | 9e060c2a-f011-40ff-bab2-18de8a85fa1d  |
| Root Execution ID  | bd5a619c-4ba4-495e-938d-f238f027e250  |
| State              | ERROR                                 |
| State info         | Workflow failed due to message status |
| Created at         | 2018-11-21 18:47:35                   |
| Updated at         | 2018-11-21 18:47:39                   |
+--------------------+---------------------------------------+

[stack@undercloud (stackrc) ~]$ mistral task-get 9e060c2a-f011-40ff-bab2-18de8a85fa1d
+-----------------------+----------------------------------------------+
| Field                 | Value                                        |
+-----------------------+----------------------------------------------+
| ID                    | 9e060c2a-f011-40ff-bab2-18de8a85fa1d         |
| Name                  | send_message                                 |
| Workflow name         | tripleo.deployment.v1.config_download_deploy |
| Workflow namespace    |                                              |
| Workflow Execution ID | bd5a619c-4ba4-495e-938d-f238f027e250         |
| State                 | ERROR                                        |
| State info            | Workflow failed due to message status        |
| Created at            | 2018-11-21 18:47:35                          |
| Updated at            | 2018-11-21 18:47:39                          |
+-----------------------+----------------------------------------------+

[stack@undercloud (stackrc) ~]$ mistral execution-get bd5a619c-4ba4-495e-938d-f238f027e250
+--------------------+-----------------------------------------------------------------------------------------------------------+
| Field              | Value                                                                                                     |
+--------------------+-----------------------------------------------------------------------------------------------------------+
| ID                 | bd5a619c-4ba4-495e-938d-f238f027e250                                                                      |
| Workflow ID        | 29d5cd71-4b32-4dfc-9710-38a213b2e8e6                                                                      |
| Workflow name      | tripleo.deployment.v1.config_download_deploy                                                              |
| Workflow namespace |                                                                                                           |
| Description        |                                                                                                           |
| Task Execution ID  |                                                                                                           |
| Root Execution ID  |                                                                                                           |
| State              | ERROR                                                                                                     |
| State info         | Failure caused by error in tasks: send_message                                                            |
|                    |                                                                                                           |
|                    | send_message [task_ex_id=9e060c2a-f011-40ff-bab2-18de8a85fa1d] -> Workflow failed due to message status   |
|                    | [wf_ex_id=dc662a9e-91e6-45a2-a033-0300e7bdc386, idx=0]: Workflow failed due to message status             |
|                    |                                                                                                           |
| Created at         | 2018-11-21 18:46:12                                                                                       |
| Updated at         | 2018-11-21 18:47:41                                                                                       |
+--------------------+-----------------------------------------------------------------------------------------------------------+

Logs observed on the undercloud, in /var/log/container/engine.log:

./engine.log:2018-11-21 18:47:39.968 7 INFO workflow_trace [req-38c24156-db63-4fae-9f2d-cf389ad8b04c fc9c6785739f40c7bfa7c22621f4ddbf 3a971c7740294e00b8109a7d12b7f471 - default default] Task 'send_message' (9e060c2a-f011-40ff-bab2-18de8a85fa1d) [RUNNING -> ERROR, msg=Workflow failed due to message status] (execution_id=bd5a619c-4ba4-495e-938d-f238f027e250)
./engine.log:2018-11-21 18:47:41.068 7 INFO workflow_trace [req-525265a7-4cbd-48dc-8de6-6c707e2f25a9 - - - - -] Workflow 'tripleo.deployment.v1.config_download_deploy' [RUNNING -> ERROR, msg=Failure caused by error in tasks: send_message
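Aside: the drill-down above (root execution -> failed task -> sub-workflow execution) can also be scripted instead of chaining mistral execution-get / task-get by hand. This is only a rough sketch assuming python-mistralclient's v2 API; the endpoint URL and token are placeholders (on an undercloud they would come from stackrc / "openstack token issue"), and the exact client() keyword arguments may differ by release:

    # Rough sketch (assumptions noted above): recursively print the ERROR
    # tasks of a Mistral workflow execution, following sub-workflows.
    from mistralclient.api import client

    mistral = client.client(
        mistral_url='http://192.168.24.1:8989/v2',  # placeholder endpoint
        auth_token='PLACEHOLDER_TOKEN')             # placeholder token

    def dump_failed_tasks(execution_id, depth=0):
        pad = '  ' * depth
        ex = mistral.executions.get(execution_id)
        print('%s%s %s [%s]' % (pad, ex.workflow_name, ex.id, ex.state))
        for task in mistral.tasks.list(workflow_execution_id=execution_id):
            if task.state != 'ERROR':
                continue
            print('%s  task %s (%s): %s'
                  % (pad, task.name, task.id, task.state_info))
            # Sub-workflow executions point back at the task that spawned
            # them via task_execution_id (the "Task Execution ID" field).
            for sub in mistral.executions.list():
                if getattr(sub, 'task_execution_id', None) == task.id:
                    dump_failed_tasks(sub.id, depth + 1)

    # Root execution of the failed deploy above:
    dump_failed_tasks('bd5a619c-4ba4-495e-938d-f238f027e250')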
The most relevant log error I've found so far (the UUID is not the UUID of the newly scaled-out compute, but the error occurs during scale-out at the same timestamp as above), in /var/log/container/executor.log:

2018-11-21 18:47:35.380 7 WARNING mistral.executors.default_executor [req-eb5348cd-bd3c-4ea6-9aac-5653a0942c0d fc9c6785739f40c7bfa7c22621f4ddbf 3a971c7740294e00b8109a7d12b7f471 - default default] The action raised an exception [action_ex_id=bc158ba8-ea5e-42d0-9712-30d24698a364, action_cls='', attributes='{}', params='{u'container_config': u'overcloud-config', u'container': u'overcloud'}'] u'23f1abe0-aa3f-46c3-a712-d3216b9b297d': KeyError: u'23f1abe0-aa3f-46c3-a712-d3216b9b297d'
2018-11-21 18:47:35.380 7 ERROR mistral.executors.default_executor Traceback (most recent call last):
2018-11-21 18:47:35.380 7 ERROR mistral.executors.default_executor   File "/usr/lib/python2.7/site-packages/mistral/executors/default_executor.py", line 114, in run_action
2018-11-21 18:47:35.380 7 ERROR mistral.executors.default_executor     result = action.run(action_ctx)
2018-11-21 18:47:35.380 7 ERROR mistral.executors.default_executor   File "/usr/lib/python2.7/site-packages/tripleo_common/actions/config.py", line 76, in run
2018-11-21 18:47:35.380 7 ERROR mistral.executors.default_executor     commit_message=message)
2018-11-21 18:47:35.380 7 ERROR mistral.executors.default_executor   File "/usr/lib/python2.7/site-packages/tripleo_common/utils/config.py", line 424, in download_config
2018-11-21 18:47:35.380 7 ERROR mistral.executors.default_executor     self.write_config(stack, name, config_dir, config_type)
2018-11-21 18:47:35.380 7 ERROR mistral.executors.default_executor   File "/usr/lib/python2.7/site-packages/tripleo_common/utils/config.py", line 298, in write_config
2018-11-21 18:47:35.380 7 ERROR mistral.executors.default_executor     server_names[server_id],
2018-11-21 18:47:35.380 7 ERROR mistral.executors.default_executor KeyError: u'23f1abe0-aa3f-46c3-a712-d3216b9b297d'

This code does exist in our codebase; however, this looks like exactly the same issue. Looking at the servers blacklisted during the scale-out:

[stack@undercloud (stackrc) mistral]$ cat /home/stack/templates/server-blacklist.yaml
parameter_defaults:
  DeploymentServerBlacklist:
    - overcloud-ovscompute-1
    - overcloud-ovscompute-0

Comparing the KeyError UUID above against nova list shows that it belongs to overcloud-ovscompute-0:

23f1abe0-aa3f-46c3-a712-d3216b9b297d | overcloud-ovscompute-0
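To make the suspected failure mode concrete: the frame at config.py line 298 shows write_config doing an unguarded dictionary lookup, server_names[server_id]. The snippet below is NOT the real tripleo-common code, just a minimal self-contained sketch of my working theory: the blacklisted server is dropped from the server-id-to-name map while the deployment data still references its UUID, so the lookup raises exactly this KeyError. All names and the placeholder UUIDs are hypothetical; the defensive .get() variant at the end shows the kind of guard that would avoid the crash:

    # Sketch of the suspected failure mode -- not the actual tripleo-common
    # code. Assumption: DeploymentServerBlacklist leaves the blacklisted
    # server out of the id -> name map, while deployment data still
    # references its UUID.

    # Hypothetical server-id -> hostname map, as write_config would consume
    # it; the blacklisted overcloud-ovscompute-0 is absent.
    server_names = {
        'aaaaaaaa-0000-0000-0000-000000000001': 'overcloud-controller-0',
        'aaaaaaaa-0000-0000-0000-000000000002': 'overcloud-ovscompute-2',
    }

    # Hypothetical deployment entries, still carrying the blacklisted node.
    deployments = [
        {'server_id': 'aaaaaaaa-0000-0000-0000-000000000001'},
        {'server_id': '23f1abe0-aa3f-46c3-a712-d3216b9b297d'},
    ]

    for deployment in deployments:
        server_id = deployment['server_id']

        # Unguarded lookup, like server_names[server_id] at config.py:298.
        # With the data above this raises:
        #   KeyError: '23f1abe0-aa3f-46c3-a712-d3216b9b297d'
        # name = server_names[server_id]

        # Defensive variant: a blacklist-aware write_config would need to
        # skip (or at least log) servers with no name mapping.
        name = server_names.get(server_id)
        if name is None:
            print('skipping %s: no name mapping (blacklisted?)' % server_id)
            continue
        print('writing config for %s (%s)' % (name, server_id))

If that reading is right, the underlying bug would be that config download builds the deployment data and the name map from inconsistently filtered views of the stack with respect to the blacklist.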