All Ironic TripleO CI jobs are failing, starting at approximately 2014-09-29 23:00 UTC.
Error in compute log:
http://logs.openstack.org/93/124493/12/check-tripleo/check-tripleo-ironic-undercloud-precise-nonha/2bb28b6/logs/seed_logs/nova-compute.txt.gz
WARNING nova.virt.ironic.client_wrapper [req-03f2175d-d45a-44ad-9eb2-13d9a9671bb6 None] Error contacting Ironic server for 'node.update'. Attempt 58 of 60
WARNING ironicclient.common.http [req-03f2175d-d45a-44ad-9eb2-13d9a9671bb6 ] Request returned failure status.
WARNING nova.virt.ironic.client_wrapper [req-03f2175d-d45a-44ad-9eb2-13d9a9671bb6 None] Error contacting Ironic server for 'node.update'. Attempt 59 of 60
WARNING ironicclient.common.http [req-03f2175d-d45a-44ad-9eb2-13d9a9671bb6 ] Request returned failure status.
ERROR nova.virt.ironic.client_wrapper [req-03f2175d-d45a-44ad-9eb2-13d9a9671bb6 None] Error contacting Ironic server for 'node.update'. Attempt 60 of 60
ERROR nova.compute.manager [req-03f2175d-d45a-44ad-9eb2-13d9a9671bb6 None] [instance: be479e2b-2a6c-4973-8472-ce1e09c2439f] Instance failed to spawn
TRACE nova.compute.manager [instance: be479e2b-2a6c-4973-8472-ce1e09c2439f] Traceback (most recent call last):
TRACE nova.compute.manager [instance: be479e2b-2a6c-4973-8472-ce1e09c2439f] File "/opt/stack/venvs/nova/local/lib/python2.7/site-packages/nova/compute/manager.py", line 2231, in _build_resources
TRACE nova.compute.manager [instance: be479e2b-2a6c-4973-8472-ce1e09c2439f] yield resources
TRACE nova.compute.manager [instance: be479e2b-2a6c-4973-8472-ce1e09c2439f] File "/opt/stack/venvs/nova/local/lib/python2.7/site-packages/nova/compute/manager.py", line 2101, in _build_and_run_instance
TRACE nova.compute.manager [instance: be479e2b-2a6c-4973-8472-ce1e09c2439f] block_device_info=block_device_info)
TRACE nova.compute.manager [instance: be479e2b-2a6c-4973-8472-ce1e09c2439f] File "/opt/stack/venvs/nova/local/lib/python2.7/site-packages/nova/virt/ironic/driver.py", line 595, in spawn
TRACE nova.compute.manager [instance: be479e2b-2a6c-4973-8472-ce1e09c2439f] self._add_driver_fields(node, instance, image_meta, flavor)
TRACE nova.compute.manager [instance: be479e2b-2a6c-4973-8472-ce1e09c2439f] File "/opt/stack/venvs/nova/local/lib/python2.7/site-packages/nova/virt/ironic/driver.py", line 290, in _add_driver_fields
TRACE nova.compute.manager [instance: be479e2b-2a6c-4973-8472-ce1e09c2439f] icli.call('node.update', node.uuid, patch)
TRACE nova.compute.manager [instance: be479e2b-2a6c-4973-8472-ce1e09c2439f] File "/opt/stack/venvs/nova/local/lib/python2.7/site-packages/nova/virt/ironic/client_wrapper.py", line 120, in call
TRACE nova.compute.manager [instance: be479e2b-2a6c-4973-8472-ce1e09c2439f] raise exception.NovaException(msg)
TRACE nova.compute.manager [instance: be479e2b-2a6c-4973-8472-ce1e09c2439f] NovaException: Error contacting Ironic server for 'node.update'. Attempt 60 of 60
TRACE nova.compute.manager [instance: be479e2b-2a6c-4973-8472-ce1e09c2439f]
WARNING nova.virt.ironic.driver [req-03f2175d-d45a-44ad-9eb2-13d9a9671bb6 None] Destroy called on non-existing instance be479e2b-2a6c-4973-8472-ce1e09c2439f.
In Ironic API logs:
Sep 29 10:11:08 ubuntu ironic-api: 2014-09-29 10:11:08.593 3702 WARNING wsme.api [-] Client-side error: Node ddf182cf-ed69-4100-bce5-fba864dffe4d can not be updated while a state transition is in progress.
Sep 29 10:11:08 ubuntu ironic-api: 192.0.2.1 - - [29/Sep/2014 10:11:08] "PATCH /v1/nodes/ddf182cf-ed69-4100-bce5-fba864dffe4d HTTP/1.1" 409 193
Sep 29 10:11:10 ubuntu ironic-api: 2014-09-29 10:11:10.969 3702 WARNING wsme.api [-] Client-side error: Node ddf182cf-ed69-4100-bce5-fba864dffe4d can not be updated while a state transition is in progress.
Sep 29 10:11:10 ubuntu ironic-api: 192.0.2.1 - - [29/Sep/2014 10:11:10] "PATCH /v1/nodes/ddf182cf-ed69-4100-bce5-fba864dffe4d HTTP/1.1" 409 193
Sep 29 10:11:13 ubuntu ironic-api: 2014-09-29 10:11:13.252 3702 WARNING wsme.api [-] Client-side error: Node ddf182cf-ed69-4100-bce5-fba864dffe4d can not be updated while a state transition is in progress.
...and so on.
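The two logs line up: Ironic keeps answering each `node.update` PATCH with HTTP 409 because the node is mid state-transition, and Nova's client wrapper retries until its attempt budget (60 here) is exhausted, at which point spawn fails. A minimal sketch of that retry pattern, assuming illustrative names rather than the actual Nova `client_wrapper` code:

```python
# Hypothetical sketch of the retry loop visible in the compute log.
# Names (Conflict, call_with_retries) are illustrative, not Nova's real API.

class Conflict(Exception):
    """Stands in for the HTTP 409 the Ironic API returns while a
    state transition holds the node."""

def call_with_retries(method, max_retries=60):
    """Call `method`, retrying on Conflict up to max_retries times."""
    for attempt in range(1, max_retries + 1):
        try:
            return method()
        except Conflict:
            if attempt == max_retries:
                # Mirrors the "Attempt 60 of 60" NovaException in the log.
                raise RuntimeError(
                    "Error contacting Ironic server for 'node.update'. "
                    "Attempt %d of %d" % (attempt, max_retries))
            # The real driver sleeps briefly here before retrying.
```

If the node never leaves the state transition inside the retry window, as in these logs, every attempt hits the 409 and the final attempt surfaces as the exception in the compute traceback.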
We suspect https://review.openstack.org/#/c/124225/ may be the cause; a revert is being tried in https://review.openstack.org/#/c/124990/.