Comment 1 for bug 1690373

Michele Baldessari (michele) wrote:

Looking at:
logs.openstack.org/85/463985/2/check/gate-tripleo-ci-centos-7-nonha-multinode-oooq/1067ca3/logs

The last operation logged on subnode-2 is the completion of the post-configure phase:
May 12 06:56:56 centos-7-2-node-rax-ord-8800008-577002 os-collect-config: [2017-05-12 06:56:56,622] (os-refresh-config) [INFO] Completed phase post-configure

Just before that, os-collect-config signalled Heat for the InstanceIdDeployment resource and got a 200 back:
May 12 06:56:53 centos-7-2-node-rax-ord-8800008-577002 os-collect-config: ++ curl -s -w '%{http_code}' -X POST -H 'Content-Type: application/json' -o /tmp/tmp.sj2FHanUyh --data-binary '{"deploy_stdout": "os-apply-config deployment 0e65c5de-e457-440a-9c09-4a51723bf532 completed", "deploy_status_code": "0"}' 'http://192.168.24.1:8000/v1/signal/arn%3Aopenstack%3Aheat%3A%3A70e0eeedf2db4f6790f3bbb50c76e223%3Astacks%2Fovercloud-Controller-varysnuo6nyx-0-whpm4maqfizz-Controller-ppc2knqolux2%2F04dc8360-364d-4647-a7c2-90965a94140e%2Fresources%2FInstanceIdDeployment?Timestamp=2017-05-12T06%3A56%3A29Z&SignatureMethod=HmacSHA256&AWSAccessKeyId=a58bdc75277f472aa7690c18184b164f&SignatureVersion=2&Signature=vQ2ZL0avTlmW%2Fzoba4%2BaSQR1i0PofnnpKbCScL6ZBZw%3D'
May 12 06:56:56 centos-7-2-node-rax-ord-8800008-577002 os-collect-config: + status=200
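The signal target is easier to read once the percent-encoded path is decoded; a minimal sketch using only the python3 standard library (the ARN is the one from the curl line above):

```shell
# Decode the percent-encoded signal path from the os-collect-config log
# line, to confirm which stack resource the POST targets.
path='arn%3Aopenstack%3Aheat%3A%3A70e0eeedf2db4f6790f3bbb50c76e223%3Astacks%2Fovercloud-Controller-varysnuo6nyx-0-whpm4maqfizz-Controller-ppc2knqolux2%2F04dc8360-364d-4647-a7c2-90965a94140e%2Fresources%2FInstanceIdDeployment'
decoded=$(python3 -c 'import sys, urllib.parse; print(urllib.parse.unquote(sys.argv[1]))' "$path")
echo "$decoded"
```

The decoded ARN ends in resources/InstanceIdDeployment, matching the resource signalled in heat-engine.log below.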

On the undercloud, heat-api-cfn.log shows the signal arriving:
2017-05-12 06:56:56.594 4047 INFO eventlet.wsgi.server [req-e2a159fe-3806-45a2-9972-4d510300d84a 47c98f069b0d4633a7a17156c6ba04e1 4af916985a214dd5955f7ed87971413c - 6e59edb5d4cf4033b30791e96cfcb654 6e59edb5d4cf4033b30791e96cfcb654] 192.168.24.3 - - [12/May/2017 06:56:56] "POST /v1/signal/arn%3Aopenstack%3Aheat%3A%3A70e0eeedf2db4f6790f3bbb50c76e223%3Astacks%2Fovercloud-Controller-varysnuo6nyx-0-whpm4maqfizz-Controller-ppc2knqolux2%2F04dc8360-364d-4647-a7c2-90965a94140e%2Fresources%2FInstanceIdDeployment?Timestamp=2017-05-12T06%3A56%3A29Z&SignatureMethod=HmacSHA256&AWSAccessKeyId=a58bdc75277f472aa7690c18184b164f&SignatureVersion=2&Signature=vQ2ZL0avTlmW%2Fzoba4%2BaSQR1i0PofnnpKbCScL6ZBZw%3D HTTP/1.1" 200 115 3.394451

Tracing that request (req-e2a159fe-3806-45a2-9972-4d510300d84a) through the heat services shows the following (lines are in grep order, not strictly chronological):
undercloud/var/log/heat/heat-engine.log.txt.gz:2017-05-12 06:56:55.068 4058 DEBUG heat.engine.clients.keystoneclient [req-e2a159fe-3806-45a2-9972-4d510300d84a - 70e0eeedf2db4f6790f3bbb50c76e223 - - -] Using stack domain heat_stack __init__ /usr/lib/python2.7/site-packages/heat/engine/clients/os/keystone/heat_keystoneclient.py:98
undercloud/var/log/heat/heat-engine.log.txt.gz:2017-05-12 06:56:54.159 4058 WARNING oslo_config.cfg [req-e2a159fe-3806-45a2-9972-4d510300d84a - 70e0eeedf2db4f6790f3bbb50c76e223 - - -] Option "auth_plugin" from group "trustee" is deprecated. Use option "auth_type" from group "trustee".
undercloud/var/log/heat/heat-engine.log.txt.gz:2017-05-12 06:56:56.589 4058 DEBUG oslo_messaging._drivers.amqpdriver [req-e2a159fe-3806-45a2-9972-4d510300d84a - 70e0eeedf2db4f6790f3bbb50c76e223 - - -] sending reply msg_id: 96c04daa60ea49fc956ddc2c81c2325a reply queue: reply_f730340a29fd463aadc604f3bb6d3445 time elapsed: 2.478203106s _send_reply /usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py:73
undercloud/var/log/heat/heat-engine.log.txt.gz:2017-05-12 06:56:56.591 4058 DEBUG heat.engine.service [req-e2a159fe-3806-45a2-9972-4d510300d84a - 70e0eeedf2db4f6790f3bbb50c76e223 - - -] signaling resource overcloud-Controller-varysnuo6nyx-0-whpm4maqfizz-Controller-ppc2knqolux2:InstanceIdDeployment _resource_signal /usr/lib/python2.7/site-packages/heat/engine/service.py:1807
undercloud/var/log/heat/heat-engine.log.txt.gz:2017-05-12 06:56:56.642 4058 DEBUG oslo_messaging._drivers.amqpdriver [req-e2a159fe-3806-45a2-9972-4d510300d84a - 70e0eeedf2db4f6790f3bbb50c76e223 - - -] CALL msg_id: 86f08b9195a74c1182567c7789485f8b exchange 'heat' topic 'engine' _send /usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py:448
undercloud/var/log/heat/heat-engine.log.txt.gz:2017-05-12 06:56:56.925 4057 DEBUG oslo_messaging._drivers.amqpdriver [req-e2a159fe-3806-45a2-9972-4d510300d84a - 70e0eeedf2db4f6790f3bbb50c76e223 - - -] sending reply msg_id: 86f08b9195a74c1182567c7789485f8b reply queue: reply_dc1ca8984ed1486194739f1e78e14688 time elapsed: 0.0253046449998s _send_reply /usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py:73
undercloud/var/log/heat/heat-engine.log.txt.gz:2017-05-12 06:56:56.930 4058 DEBUG oslo_messaging._drivers.amqpdriver [req-e2a159fe-3806-45a2-9972-4d510300d84a - 70e0eeedf2db4f6790f3bbb50c76e223 - - -] CALL msg_id: 2e57fed65172405da6edd1e9b8bfda7d exchange 'heat' topic 'engine' _send /usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py:448
undercloud/var/log/heat/heat-engine.log.txt.gz:2017-05-12 06:56:56.965 4058 DEBUG oslo_messaging._drivers.amqpdriver [req-e2a159fe-3806-45a2-9972-4d510300d84a - 70e0eeedf2db4f6790f3bbb50c76e223 - - -] sending reply msg_id: 2e57fed65172405da6edd1e9b8bfda7d reply queue: reply_dc1ca8984ed1486194739f1e78e14688 time elapsed: 0.032641324s _send_reply /usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py:73
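The trace above can be reproduced by grepping the request ID across the gzipped heat logs from the CI tarball; a sketch, with a synthetic log line standing in for the real files so it runs standalone:

```shell
# Trace one request ID across compressed heat logs and put the hits
# back into timestamp order. A synthetic heat-engine line stands in
# for the real undercloud/var/log/heat/*.log.txt.gz files.
req='req-e2a159fe-3806-45a2-9972-4d510300d84a'
logdir=$(mktemp -d)
printf '2017-05-12 06:56:55.068 4058 DEBUG heat.engine.clients.keystoneclient [%s - ...] Using stack domain heat_stack\n' "$req" > "$logdir/heat-engine.log"
gzip "$logdir/heat-engine.log"
# gunzip -c reads the compressed files in place; sort merges hits from
# multiple files back into timestamp order:
trace=$(gunzip -c "$logdir"/*.log.gz | grep "$req" | sort)
echo "$trace"
```

The same pattern works against the full tarball by pointing the glob at the extracted log directory.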

One error I did notice is this glance 404 in heat-engine.log:
heat-engine.log.txt.gz:2017-05-12 06:56:23.397 4058 DEBUG glanceclient.common.http [req-f1f9166e-9152-4039-930a-702f30fa869a da857f112c104b7b82c0e80eb58c648c 70e0eeedf2db4f6790f3bbb50c76e223 - - -] Request returned failure status 404. _handle_response /usr/lib/python2.7/site-packages/glanceclient/common/http.py:106

Could we be hitting glance/swift issues? The glance api.log shows a Container HEAD failure against the swift backend, followed by repeated image 404s:
api.log.txt.gz:2017-05-12 06:53:16.014 3560 INFO swiftclient [req-881cc954-1238-4240-b363-fb3191f199c9 da857f112c104b7b82c0e80eb58c648c 70e0eeedf2db4f6790f3bbb50c76e223 - default default] RESP STATUS: 404 Not Found
api.log.txt.gz:2017-05-12 06:53:16.015 3560 ERROR swiftclient [req-881cc954-1238-4240-b363-fb3191f199c9 da857f112c104b7b82c0e80eb58c648c 70e0eeedf2db4f6790f3bbb50c76e223 - default default] Container HEAD failed: http://192.168.24.1:8080/v1/AUTH_8737cddce65d4cd18483a8785bf517bd/glance 404 Not Found
api.log.txt.gz:2017-05-12 06:53:16.015 3560 ERROR swiftclient ClientException: Container HEAD failed: http://192.168.24.1:8080/v1/AUTH_8737cddce65d4cd18483a8785bf517bd/glance 404 Not Found
api.log.txt.gz:2017-05-12 06:54:16.195 3561 INFO eventlet.wsgi.server [req-4519b27d-8381-4c8c-af76-7001c5316ca3 da857f112c104b7b82c0e80eb58c648c 70e0eeedf2db4f6790f3bbb50c76e223 - default default] 192.168.24.1 - - [12/May/2017 06:54:16] "HEAD /v1/images/bm-deploy-kernel HTTP/1.1" 404 213 0.641636
api.log.txt.gz:2017-05-12 06:54:16.225 3561 INFO eventlet.wsgi.server [req-cd136535-2a9e-4a5c-80c0-ad35813575fa da857f112c104b7b82c0e80eb58c648c 70e0eeedf2db4f6790f3bbb50c76e223 - default default] 192.168.24.1 - - [12/May/2017 06:54:16] "HEAD /v1/images/bm-deploy-kernel HTTP/1.1" 404 213 0.025524
api.log.txt.gz:2017-05-12 06:54:16.315 3561 INFO eventlet.wsgi.server [req-fa4bb63b-6119-42b5-bcd0-db422ca28d51 da857f112c104b7b82c0e80eb58c648c 70e0eeedf2db4f6790f3bbb50c76e223 - default default] 192.168.24.1 - - [12/May/2017 06:54:16] "HEAD /v1/images/bm-deploy-ramdisk HTTP/1.1" 404 213 0.026485
api.log.txt.gz:2017-05-12 06:54:16.347 3561 INFO eventlet.wsgi.server [req-654f8c67-718f-43fc-8b7f-4dd9ac59885f da857f112c104b7b82c0e80eb58c648c 70e0eeedf2db4f6790f3bbb50c76e223 - default default] 192.168.24.1 - - [12/May/2017 06:54:16] "HEAD /v1/images/bm-deploy-ramdisk HTTP/1.1" 404 213 0.028838
api.log.txt.gz:2017-05-12 06:56:23.395 3560 INFO eventlet.wsgi.server [req-09b61bbf-b265-4b51-8621-a43696ddfbcd da857f112c104b7b82c0e80eb58c648c 70e0eeedf2db4f6790f3bbb50c76e223 - default default] 192.168.24.1 - - [12/May/2017 06:56:23] "HEAD /v1/images/overcloud-full HTTP/1.1" 404 213 0.315360
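To see exactly which images the 404s are for, the image names can be pulled out of the api.log excerpt above; a self-contained sketch over the pasted lines:

```shell
# Extract the distinct image names behind the glance HEAD 404s from the
# api.log lines quoted above.
missing=$(grep -o 'HEAD /v1/images/[^ ]*' <<'EOF' | sed 's|.*/||' | sort -u
api.log.txt.gz:2017-05-12 06:54:16.195 3561 INFO eventlet.wsgi.server [req-4519b27d-8381-4c8c-af76-7001c5316ca3 da857f112c104b7b82c0e80eb58c648c 70e0eeedf2db4f6790f3bbb50c76e223 - default default] 192.168.24.1 - - [12/May/2017 06:54:16] "HEAD /v1/images/bm-deploy-kernel HTTP/1.1" 404 213 0.641636
api.log.txt.gz:2017-05-12 06:54:16.225 3561 INFO eventlet.wsgi.server [req-cd136535-2a9e-4a5c-80c0-ad35813575fa da857f112c104b7b82c0e80eb58c648c 70e0eeedf2db4f6790f3bbb50c76e223 - default default] 192.168.24.1 - - [12/May/2017 06:54:16] "HEAD /v1/images/bm-deploy-kernel HTTP/1.1" 404 213 0.025524
api.log.txt.gz:2017-05-12 06:54:16.315 3561 INFO eventlet.wsgi.server [req-fa4bb63b-6119-42b5-bcd0-db422ca28d51 da857f112c104b7b82c0e80eb58c648c 70e0eeedf2db4f6790f3bbb50c76e223 - default default] 192.168.24.1 - - [12/May/2017 06:54:16] "HEAD /v1/images/bm-deploy-ramdisk HTTP/1.1" 404 213 0.026485
api.log.txt.gz:2017-05-12 06:54:16.347 3561 INFO eventlet.wsgi.server [req-654f8c67-718f-43fc-8b7f-4dd9ac59885f da857f112c104b7b82c0e80eb58c648c 70e0eeedf2db4f6790f3bbb50c76e223 - default default] 192.168.24.1 - - [12/May/2017 06:54:16] "HEAD /v1/images/bm-deploy-ramdisk HTTP/1.1" 404 213 0.028838
api.log.txt.gz:2017-05-12 06:56:23.395 3560 INFO eventlet.wsgi.server [req-09b61bbf-b265-4b51-8621-a43696ddfbcd da857f112c104b7b82c0e80eb58c648c 70e0eeedf2db4f6790f3bbb50c76e223 - default default] 192.168.24.1 - - [12/May/2017 06:56:23] "HEAD /v1/images/overcloud-full HTTP/1.1" 404 213 0.315360
EOF
)
echo "$missing"
```

All three names (bm-deploy-kernel, bm-deploy-ramdisk, overcloud-full) come back missing, which together with the Container HEAD failure on the glance swift container would point at the image upload step rather than a single bad image.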