No valid host was found error

Bug #1723155 reported by José Donoso
This bug affects 2 people
Affects: kolla-ansible
Status: Expired
Importance: Undecided
Assigned to: Unassigned
Milestone: (none)

Bug Description

Description
===========
When I try to launch an instance, the following error appears: "No valid host was found".

Expected result
===============
Instance successfully launched.

Actual result
=============
Failed to perform requested operation on instance "c", the instance has an error status: Please try again later [Error: No valid host was found. ]

Environment
===========
I deployed Kolla-Ansible OpenStack from binary packages with yum, so it is the latest version.
I have one node acting as controller, network, monitoring, and compute node; one compute-only node; and one storage-only node.
Every node has two interfaces: one is the Kolla internal/external interface and the other is the Neutron external interface.

I'm using KVM as the hypervisor.
I installed the libvirtd service on the controller node after the deployment; it is the only node with libvirtd installed.

Ceph is the storage backend.
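
For reference, the Ceph-related switches such a deployment normally enables in globals.yml (a sketch with assumed values, not the actual file from this report):

enable_ceph: "yes"
glance_backend_ceph: "yes"
cinder_backend_ceph: "yes"
nova_backend_ceph: "yes"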

Logs & Configs
==============
In nova-api.log

2017-10-12 12:14:00.942 25 DEBUG nova.osapi_compute.wsgi.server [req-10b4d056-6933-4fa6-ba9f-162bc65945fd - - - - -] (25) accepted ('172.30.220.3', 55626) server /usr/lib/python2.7/site-packages/eventlet/wsgi.py:883
2017-10-12 12:14:00.956 25 DEBUG nova.api.openstack.wsgi [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Action: 'create', calling method: <bound method ServersController.create of <nova.api.openstack.compute.servers.ServersController object at 0x848bad0>>, body: {"server": {"name": "c", "imageRef": "", "availability_zone": "nova", "key_name": "mykey", "flavorRef": "3", "OS-DCF:diskConfig": "AUTO", "max_count": 1, "block_device_mapping_v2": [{"boot_index": "0", "uuid": "e8e6918b-20cc-4d79-905a-6556f6b0e35c", "volume_size": 40, "device_name": "vda", "source_type": "image", "destination_type": "volume", "delete_on_termination": false}], "min_count": 1, "networks": [{"uuid": "c539fd11-f789-43d0-81f1-d3809746aa8e"}], "security_groups": [{"name": "9203d5cb-7ee5-4f34-a600-867e29b1a48e"}]}} _process_stack /usr/lib/python2.7/site-packages/nova/api/openstack/wsgi.py:609
2017-10-12 12:14:01.014 25 DEBUG oslo_concurrency.lockutils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:01.014 25 DEBUG oslo_concurrency.lockutils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:01.015 25 DEBUG oslo_concurrency.lockutils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "e4e34c46-c1ef-4480-84a6-9dd3996a0a82" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:01.016 25 DEBUG oslo_concurrency.lockutils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "e4e34c46-c1ef-4480-84a6-9dd3996a0a82" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:01.037 25 DEBUG oslo_concurrency.lockutils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:01.038 25 DEBUG oslo_concurrency.lockutils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:01.058 25 DEBUG oslo_concurrency.lockutils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "e4e34c46-c1ef-4480-84a6-9dd3996a0a82" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:01.058 25 DEBUG oslo_concurrency.lockutils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "e4e34c46-c1ef-4480-84a6-9dd3996a0a82" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:01.509 25 DEBUG nova.network.neutronv2.api [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] validate_networks() for [(u'c539fd11-f789-43d0-81f1-d3809746aa8e', None, None, None)] validate_networks /usr/lib/python2.7/site-packages/nova/network/neutronv2/api.py:1679
2017-10-12 12:14:01.706 25 DEBUG nova.virt.hardware [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] emulator threads policy constraint: None get_emulator_threads_constraint /usr/lib/python2.7/site-packages/nova/virt/hardware.py:1287
2017-10-12 12:14:01.718 25 DEBUG nova.quota [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Getting quotas for project 189671b052b34af8b34edbf9a3f87e76. Resources: ['metadata_items'] _get_quotas /usr/lib/python2.7/site-packages/nova/quota.py:442
2017-10-12 12:14:01.723 25 DEBUG nova.quota [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Getting quotas for user d45a3d94c6fc4e34b7b9d6bbab8277b4 and project 189671b052b34af8b34edbf9a3f87e76. Resources: ['metadata_items'] _get_quotas /usr/lib/python2.7/site-packages/nova/quota.py:434
2017-10-12 12:14:01.745 25 DEBUG nova.quota [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Getting quotas for project 189671b052b34af8b34edbf9a3f87e76. Resources: ['injected_files'] _get_quotas /usr/lib/python2.7/site-packages/nova/quota.py:442
2017-10-12 12:14:01.749 25 DEBUG nova.quota [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Getting quotas for user d45a3d94c6fc4e34b7b9d6bbab8277b4 and project 189671b052b34af8b34edbf9a3f87e76. Resources: ['injected_files'] _get_quotas /usr/lib/python2.7/site-packages/nova/quota.py:434
2017-10-12 12:14:01.769 25 DEBUG nova.quota [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Getting quotas for project 189671b052b34af8b34edbf9a3f87e76. Resources: ['injected_file_content_bytes', 'injected_file_path_bytes'] _get_quotas /usr/lib/python2.7/site-packages/nova/quota.py:442
2017-10-12 12:14:01.773 25 DEBUG nova.quota [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Getting quotas for user d45a3d94c6fc4e34b7b9d6bbab8277b4 and project 189671b052b34af8b34edbf9a3f87e76. Resources: ['injected_file_content_bytes', 'injected_file_path_bytes'] _get_quotas /usr/lib/python2.7/site-packages/nova/quota.py:434
2017-10-12 12:14:01.785 25 DEBUG oslo_concurrency.lockutils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:01.786 25 DEBUG oslo_concurrency.lockutils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:01.786 25 DEBUG oslo_concurrency.lockutils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "e4e34c46-c1ef-4480-84a6-9dd3996a0a82" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:01.787 25 DEBUG oslo_concurrency.lockutils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "e4e34c46-c1ef-4480-84a6-9dd3996a0a82" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:01.806 25 DEBUG nova.quota [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Getting quotas for project 189671b052b34af8b34edbf9a3f87e76. Resources: set(['instances', 'ram', 'cores']) _get_quotas /usr/lib/python2.7/site-packages/nova/quota.py:442
2017-10-12 12:14:01.811 25 DEBUG nova.quota [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Getting quotas for user d45a3d94c6fc4e34b7b9d6bbab8277b4 and project 189671b052b34af8b34edbf9a3f87e76. Resources: set(['instances', 'ram', 'cores']) _get_quotas /usr/lib/python2.7/site-packages/nova/quota.py:434
2017-10-12 12:14:01.824 25 DEBUG nova.compute.api [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Going to run 1 instances... _provision_instances /usr/lib/python2.7/site-packages/nova/compute/api.py:891
2017-10-12 12:14:01.839 25 DEBUG nova.compute.api [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] [instance: 0e33fd82-a847-4b77-8290-01af48faf79f] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=False,deleted=<?>,deleted_at=<?>,destination_type='volume',device_name='/dev/vda',device_type=None,disk_bus=None,guest_format=None,id=<?>,image_id='e8e6918b-20cc-4d79-905a-6556f6b0e35c',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,volume_id=None,volume_size=40)] _bdm_validate_set_size_and_instance /usr/lib/python2.7/site-packages/nova/compute/api.py:1268
2017-10-12 12:14:01.927 25 INFO nova.osapi_compute.wsgi.server [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "POST /v2.1/189671b052b34af8b34edbf9a3f87e76/os-volumes_boot HTTP/1.1" status: 202 len: 975 time: 0.9833241
2017-10-12 12:14:01.949 25 DEBUG nova.api.openstack.wsgi [req-5d32880e-5b39-40bb-8194-e8c583a45b47 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Calling method '<bound method ServersController.show of <nova.api.openstack.compute.servers.ServersController object at 0x886b6d0>>' _process_stack /usr/lib/python2.7/site-packages/nova/api/openstack/wsgi.py:612
2017-10-12 12:14:01.957 25 DEBUG nova.compute.api [req-5d32880e-5b39-40bb-8194-e8c583a45b47 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] [instance: 0e33fd82-a847-4b77-8290-01af48faf79f] Fetching instance by UUID get /usr/lib/python2.7/site-packages/nova/compute/api.py:2279
2017-10-12 12:14:02.023 25 DEBUG nova.policy [req-5d32880e-5b39-40bb-8194-e8c583a45b47 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Policy check for os_compute_api:os-hide-server-addresses failed with credentials {'service_roles': [], 'user_id': u'd45a3d94c6fc4e34b7b9d6bbab8277b4', 'roles': [u'heat_stack_owner', u'admin'], 'user_domain_id': u'default', 'service_project_id': None, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_domain_id': None, 'is_admin_project': True, 'is_admin': True, 'project_id': u'189671b052b34af8b34edbf9a3f87e76', 'project_domain_id': u'default'} authorize /usr/lib/python2.7/site-packages/nova/policy.py:168
2017-10-12 12:14:02.080 25 INFO nova.osapi_compute.wsgi.server [req-5d32880e-5b39-40bb-8194-e8c583a45b47 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "GET /v2.1/189671b052b34af8b34edbf9a3f87e76/servers/0e33fd82-a847-4b77-8290-01af48faf79f HTTP/1.1" status: 200 len: 1599 time: 0.1470160
2017-10-12 12:14:02.170 25 DEBUG nova.osapi_compute.wsgi.server [req-10b4d056-6933-4fa6-ba9f-162bc65945fd - - - - -] (25) accepted ('172.30.220.3', 55674) server /usr/lib/python2.7/site-packages/eventlet/wsgi.py:883
2017-10-12 12:14:02.170 24 DEBUG nova.osapi_compute.wsgi.server [req-a8cc65c5-82c2-42a1-92e5-2b913e946d96 - - - - -] (24) accepted ('172.30.220.3', 55678) server /usr/lib/python2.7/site-packages/eventlet/wsgi.py:883
2017-10-12 12:14:02.182 24 DEBUG nova.api.openstack.wsgi [req-762e9cba-6142-42a2-bfad-7738c8aaeb91 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Calling method '<bound method FlavorsController.detail of <nova.api.openstack.compute.flavors.FlavorsController object at 0x8146890>>' _process_stack /usr/lib/python2.7/site-packages/nova/api/openstack/wsgi.py:612
2017-10-12 12:14:02.185 25 INFO nova.osapi_compute.wsgi.server [req-a2a8d9a8-21fa-46c3-9457-aee1e12b5aaa d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "GET /v2.1/189671b052b34af8b34edbf9a3f87e76 HTTP/1.1" status: 404 len: 358 time: 0.0133569
2017-10-12 12:14:02.199 25 DEBUG nova.api.openstack.wsgi [req-ae24dfb9-e8ef-4d52-b858-4d8575bca1c6 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Calling method '<bound method VersionsController.show of <nova.api.openstack.compute.versionsV21.VersionsController object at 0x8146050>>' _process_stack /usr/lib/python2.7/site-packages/nova/api/openstack/wsgi.py:612
2017-10-12 12:14:02.201 25 INFO nova.osapi_compute.wsgi.server [req-ae24dfb9-e8ef-4d52-b858-4d8575bca1c6 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "GET /v2.1/ HTTP/1.1" status: 200 len: 763 time: 0.0105250
2017-10-12 12:14:02.230 24 INFO nova.osapi_compute.wsgi.server [req-762e9cba-6142-42a2-bfad-7738c8aaeb91 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "GET /v2.1/189671b052b34af8b34edbf9a3f87e76/flavors/detail HTTP/1.1" status: 200 len: 2495 time: 0.0585120
2017-10-12 12:14:02.285 24 DEBUG nova.osapi_compute.wsgi.server [req-a8cc65c5-82c2-42a1-92e5-2b913e946d96 - - - - -] (24) accepted ('172.30.220.3', 55696) server /usr/lib/python2.7/site-packages/eventlet/wsgi.py:883
2017-10-12 12:14:02.363 24 DEBUG nova.api.openstack.wsgi [req-f9b96e2b-4b07-4d4e-8415-21c6f51b51bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Calling method '<bound method ServersController.detail of <nova.api.openstack.compute.servers.ServersController object at 0x872c390>>' _process_stack /usr/lib/python2.7/site-packages/nova/api/openstack/wsgi.py:612
2017-10-12 12:14:02.371 24 DEBUG nova.compute.api [req-f9b96e2b-4b07-4d4e-8415-21c6f51b51bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Searching by: {'deleted': False, u'project_id': u'189671b052b34af8b34edbf9a3f87e76'} get_all /usr/lib/python2.7/site-packages/nova/compute/api.py:2311
2017-10-12 12:14:02.389 24 DEBUG oslo_concurrency.lockutils [req-f9b96e2b-4b07-4d4e-8415-21c6f51b51bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:02.390 24 DEBUG oslo_concurrency.lockutils [req-f9b96e2b-4b07-4d4e-8415-21c6f51b51bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:02.438 24 DEBUG nova.compute.api [req-f9b96e2b-4b07-4d4e-8415-21c6f51b51bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Skipping already-collected cell0 list _get_instances_by_filters_all_cells /usr/lib/python2.7/site-packages/nova/compute/api.py:2503
2017-10-12 12:14:02.439 24 DEBUG nova.compute.api [req-f9b96e2b-4b07-4d4e-8415-21c6f51b51bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Listing 20 instances in cell e4e34c46-c1ef-4480-84a6-9dd3996a0a82 _get_instances_by_filters_all_cells /usr/lib/python2.7/site-packages/nova/compute/api.py:2506
2017-10-12 12:14:02.439 24 DEBUG oslo_concurrency.lockutils [req-f9b96e2b-4b07-4d4e-8415-21c6f51b51bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "e4e34c46-c1ef-4480-84a6-9dd3996a0a82" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:02.440 24 DEBUG oslo_concurrency.lockutils [req-f9b96e2b-4b07-4d4e-8415-21c6f51b51bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "e4e34c46-c1ef-4480-84a6-9dd3996a0a82" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:02.508 24 DEBUG nova.policy [req-f9b96e2b-4b07-4d4e-8415-21c6f51b51bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Policy check for os_compute_api:os-hide-server-addresses failed with credentials {'service_roles': [], 'user_id': u'd45a3d94c6fc4e34b7b9d6bbab8277b4', 'roles': [u'heat_stack_owner', u'admin'], 'user_domain_id': u'default', 'service_project_id': None, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_domain_id': None, 'is_admin_project': True, 'is_admin': True, 'project_id': u'189671b052b34af8b34edbf9a3f87e76', 'project_domain_id': u'default'} authorize /usr/lib/python2.7/site-packages/nova/policy.py:168
2017-10-12 12:14:02.666 24 INFO nova.osapi_compute.wsgi.server [req-f9b96e2b-4b07-4d4e-8415-21c6f51b51bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "GET /v2.1/189671b052b34af8b34edbf9a3f87e76/servers/detail?limit=21&project_id=189671b052b34af8b34edbf9a3f87e76 HTTP/1.1" status: 200 len: 1932 time: 0.3798020
2017-10-12 12:14:03.362 24 DEBUG nova.api.openstack.wsgi [req-d392ac5e-7c85-4ab6-b4ed-f19064747e71 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Calling method '<function version_select at 0x9f00050>' _process_stack /usr/lib/python2.7/site-packages/nova/api/openstack/wsgi.py:612
2017-10-12 12:14:03.400 24 DEBUG oslo_concurrency.lockutils [req-d392ac5e-7c85-4ab6-b4ed-f19064747e71 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:03.401 24 DEBUG oslo_concurrency.lockutils [req-d392ac5e-7c85-4ab6-b4ed-f19064747e71 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:03.402 24 DEBUG oslo_concurrency.lockutils [req-d392ac5e-7c85-4ab6-b4ed-f19064747e71 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "e4e34c46-c1ef-4480-84a6-9dd3996a0a82" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:03.402 24 DEBUG oslo_concurrency.lockutils [req-d392ac5e-7c85-4ab6-b4ed-f19064747e71 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "e4e34c46-c1ef-4480-84a6-9dd3996a0a82" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:03.435 24 INFO nova.osapi_compute.wsgi.server [req-d392ac5e-7c85-4ab6-b4ed-f19064747e71 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "GET /v2.1/189671b052b34af8b34edbf9a3f87e76/limits?reserved=1 HTTP/1.1" status: 200 len: 889 time: 0.0831459
2017-10-12 12:14:03.533 24 INFO nova.osapi_compute.wsgi.server [req-63d4e421-fafa-499c-ba22-364a387e26e0 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "GET /v2.1/189671b052b34af8b34edbf9a3f87e76 HTTP/1.1" status: 404 len: 358 time: 0.0092881
2017-10-12 12:14:03.544 24 DEBUG nova.api.openstack.wsgi [req-bdd4a3f5-e098-462b-a4e7-bff2926b0dfc d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Calling method '<bound method VersionsController.show of <nova.api.openstack.compute.versionsV21.VersionsController object at 0x8146050>>' _process_stack /usr/lib/python2.7/site-packages/nova/api/openstack/wsgi.py:612
2017-10-12 12:14:03.549 24 INFO nova.osapi_compute.wsgi.server [req-bdd4a3f5-e098-462b-a4e7-bff2926b0dfc d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "GET /v2.1/ HTTP/1.1" status: 200 len: 763 time: 0.0131600
2017-10-12 12:14:03.602 24 DEBUG nova.api.openstack.wsgi [req-9107cb8a-2741-4e1b-99aa-efe4c56ed8bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Calling method '<function version_select at 0x9eda5f0>' _process_stack /usr/lib/python2.7/site-packages/nova/api/openstack/wsgi.py:612
2017-10-12 12:14:03.635 24 DEBUG oslo_concurrency.lockutils [req-9107cb8a-2741-4e1b-99aa-efe4c56ed8bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:03.635 24 DEBUG oslo_concurrency.lockutils [req-9107cb8a-2741-4e1b-99aa-efe4c56ed8bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:03.636 24 DEBUG oslo_concurrency.lockutils [req-9107cb8a-2741-4e1b-99aa-efe4c56ed8bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "e4e34c46-c1ef-4480-84a6-9dd3996a0a82" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:03.637 24 DEBUG oslo_concurrency.lockutils [req-9107cb8a-2741-4e1b-99aa-efe4c56ed8bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "e4e34c46-c1ef-4480-84a6-9dd3996a0a82" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:03.666 24 INFO nova.osapi_compute.wsgi.server [req-9107cb8a-2741-4e1b-99aa-efe4c56ed8bb d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "GET /v2.1/189671b052b34af8b34edbf9a3f87e76/limits?reserved=1 HTTP/1.1" status: 200 len: 889 time: 0.0745518
2017-10-12 12:14:04.702 24 INFO nova.osapi_compute.wsgi.server [req-4a792988-e9f8-4a33-a8c7-4c09f0dacbd5 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "GET /v2.1/189671b052b34af8b34edbf9a3f87e76 HTTP/1.1" status: 404 len: 358 time: 0.0117171
2017-10-12 12:14:04.716 24 DEBUG nova.api.openstack.wsgi [req-d76f4ea3-ab01-4b6b-8581-ab64bbb6dfcd d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Calling method '<bound method VersionsController.show of <nova.api.openstack.compute.versionsV21.VersionsController object at 0x8146050>>' _process_stack /usr/lib/python2.7/site-packages/nova/api/openstack/wsgi.py:612
2017-10-12 12:14:04.718 24 INFO nova.osapi_compute.wsgi.server [req-d76f4ea3-ab01-4b6b-8581-ab64bbb6dfcd d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "GET /v2.1/ HTTP/1.1" status: 200 len: 763 time: 0.0107520
2017-10-12 12:14:04.811 26 DEBUG nova.osapi_compute.wsgi.server [req-3f2dff86-6e4f-4423-ae35-a3d0c93b5a02 - - - - -] (26) accepted ('172.30.220.3', 55820) server /usr/lib/python2.7/site-packages/eventlet/wsgi.py:883
2017-10-12 12:14:04.905 26 DEBUG nova.api.openstack.wsgi [req-55f99f98-edd1-43ff-8a05-476f3ca6abbf d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Calling method '<bound method ServersController.show of <nova.api.openstack.compute.servers.ServersController object at 0x886b6d0>>' _process_stack /usr/lib/python2.7/site-packages/nova/api/openstack/wsgi.py:612
2017-10-12 12:14:04.910 26 DEBUG nova.compute.api [req-55f99f98-edd1-43ff-8a05-476f3ca6abbf d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] [instance: 0e33fd82-a847-4b77-8290-01af48faf79f] Fetching instance by UUID get /usr/lib/python2.7/site-packages/nova/compute/api.py:2279
2017-10-12 12:14:04.921 26 DEBUG oslo_concurrency.lockutils [req-55f99f98-edd1-43ff-8a05-476f3ca6abbf d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:04.921 26 DEBUG oslo_concurrency.lockutils [req-55f99f98-edd1-43ff-8a05-476f3ca6abbf d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:04.996 26 DEBUG oslo_concurrency.lockutils [req-55f99f98-edd1-43ff-8a05-476f3ca6abbf d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:04.996 26 DEBUG oslo_concurrency.lockutils [req-55f99f98-edd1-43ff-8a05-476f3ca6abbf d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:04.996 26 DEBUG nova.objects.instance [req-55f99f98-edd1-43ff-8a05-476f3ca6abbf d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lazy-loading 'fault' on Instance uuid 0e33fd82-a847-4b77-8290-01af48faf79f obj_load_attr /usr/lib/python2.7/site-packages/nova/objects/instance.py:1048
2017-10-12 12:14:05.022 26 DEBUG nova.policy [req-55f99f98-edd1-43ff-8a05-476f3ca6abbf d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Policy check for os_compute_api:os-hide-server-addresses failed with credentials {'service_roles': [], 'user_id': u'd45a3d94c6fc4e34b7b9d6bbab8277b4', 'roles': [u'heat_stack_owner', u'admin'], 'user_domain_id': u'default', 'service_project_id': None, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_domain_id': None, 'is_admin_project': True, 'is_admin': True, 'project_id': u'189671b052b34af8b34edbf9a3f87e76', 'project_domain_id': u'default'} authorize /usr/lib/python2.7/site-packages/nova/policy.py:168
2017-10-12 12:14:05.166 26 INFO nova.osapi_compute.wsgi.server [req-55f99f98-edd1-43ff-8a05-476f3ca6abbf d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "GET /v2.1/189671b052b34af8b34edbf9a3f87e76/servers/0e33fd82-a847-4b77-8290-01af48faf79f HTTP/1.1" status: 200 len: 3516 time: 0.3529608
2017-10-12 12:14:05.184 24 DEBUG nova.api.openstack.wsgi [req-63707545-e8c8-4e03-a55a-fa0776d39f11 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Calling method '<bound method FlavorsController.show of <nova.api.openstack.compute.flavors.FlavorsController object at 0x8146dd0>>' _process_stack /usr/lib/python2.7/site-packages/nova/api/openstack/wsgi.py:612
2017-10-12 12:14:05.204 24 INFO nova.osapi_compute.wsgi.server [req-63707545-e8c8-4e03-a55a-fa0776d39f11 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "GET /v2.1/189671b052b34af8b34edbf9a3f87e76/flavors/3 HTTP/1.1" status: 200 len: 807 time: 0.0318151
2017-10-12 12:14:05.632 24 INFO nova.osapi_compute.wsgi.server [req-d879c0bb-c59a-4820-a300-09ccb7eac5a4 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "GET /v2.1/189671b052b34af8b34edbf9a3f87e76 HTTP/1.1" status: 404 len: 358 time: 0.0108509
2017-10-12 12:14:05.644 24 DEBUG nova.api.openstack.wsgi [req-b6c86537-d323-41b6-b118-fc1ca8ebf8a5 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Calling method '<bound method VersionsController.show of <nova.api.openstack.compute.versionsV21.VersionsController object at 0x8146050>>' _process_stack /usr/lib/python2.7/site-packages/nova/api/openstack/wsgi.py:612
2017-10-12 12:14:05.645 24 INFO nova.osapi_compute.wsgi.server [req-b6c86537-d323-41b6-b118-fc1ca8ebf8a5 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] 172.30.220.254,172.30.220.3 "GET /v2.1/ HTTP/1.1" status: 200 len: 763 time: 0.0101080

And in nova-conductor.log

2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Failed to schedule instances: NoValidHost_Remote: No valid host was found.
Traceback (most recent call last):

  File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 232, in inner
    return func(*args, **kwargs)

  File "/usr/lib/python2.7/site-packages/nova/scheduler/manager.py", line 137, in select_destinations
    raise exception.NoValidHost(reason="")

NoValidHost: No valid host was found.
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager Traceback (most recent call last):
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager File "/usr/lib/python2.7/site-packages/nova/conductor/manager.py", line 1027, in schedule_and_build_instances
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager instance_uuids)
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager File "/usr/lib/python2.7/site-packages/nova/conductor/manager.py", line 626, in _schedule_instances
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager request_spec, instance_uuids)
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager File "/usr/lib/python2.7/site-packages/nova/scheduler/utils.py", line 586, in wrapped
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager return func(*args, **kwargs)
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager File "/usr/lib/python2.7/site-packages/nova/scheduler/client/__init__.py", line 52, in select_destinations
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager instance_uuids)
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager File "/usr/lib/python2.7/site-packages/nova/scheduler/client/__init__.py", line 37, in __run_method
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager return getattr(self.instance, __name)(*args, **kwargs)
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager File "/usr/lib/python2.7/site-packages/nova/scheduler/client/query.py", line 33, in select_destinations
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager instance_uuids)
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager File "/usr/lib/python2.7/site-packages/nova/scheduler/rpcapi.py", line 137, in select_destinations
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/client.py", line 169, in call
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager retry=self.retry)
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager File "/usr/lib/python2.7/site-packages/oslo_messaging/transport.py", line 123, in _send
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager timeout=timeout, retry=retry)
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager File "/usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 578, in send
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager retry=retry)
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager File "/usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 569, in _send
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager raise result
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager NoValidHost_Remote: No valid host was found.
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager Traceback (most recent call last):
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 232, in inner
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager return func(*args, **kwargs)
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager File "/usr/lib/python2.7/site-packages/nova/scheduler/manager.py", line 137, in select_destinations
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager raise exception.NoValidHost(reason="")
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager NoValidHost: No valid host was found.
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager
2017-10-12 12:14:02.686 22 ERROR nova.conductor.manager
2017-10-12 12:14:02.697 22 DEBUG oslo_concurrency.lockutils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:02.698 22 DEBUG oslo_concurrency.lockutils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Lock "00000000-0000-0000-0000-000000000000" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:02.743 22 WARNING nova.scheduler.utils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Failed to compute_task_build_instances: No valid host was found.
Traceback (most recent call last):

  File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 232, in inner
    return func(*args, **kwargs)

  File "/usr/lib/python2.7/site-packages/nova/scheduler/manager.py", line 137, in select_destinations
    raise exception.NoValidHost(reason="")

NoValidHost: No valid host was found.
: NoValidHost_Remote: No valid host was found.
Traceback (most recent call last):

  File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 232, in inner
    return func(*args, **kwargs)

  File "/usr/lib/python2.7/site-packages/nova/scheduler/manager.py", line 137, in select_destinations
    raise exception.NoValidHost(reason="")

NoValidHost: No valid host was found.
2017-10-12 12:14:02.744 22 WARNING nova.scheduler.utils [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] [instance: 0e33fd82-a847-4b77-8290-01af48faf79f] Setting instance to ERROR state.: NoValidHost_Remote: No valid host was found.
Traceback (most recent call last):

  File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 232, in inner
    return func(*args, **kwargs)

  File "/usr/lib/python2.7/site-packages/nova/scheduler/manager.py", line 137, in select_destinations
    raise exception.NoValidHost(reason="")

NoValidHost: No valid host was found.

Inventory file

# These initial groups are the only groups required to be modified. The
# additional groups are for more control of the environment.
[control]
# These hostname must be resolvable from your deployment host
172.30.220.3

# The above can also be specified as follows:
#control[01:03] ansible_user=kolla

# The network nodes are where your l3-agent and loadbalancers will run
# This can be the same as a host in the control group
[network]
172.30.220.3

[compute]
172.30.220.3
172.30.220.4

[monitoring]
172.30.220.3

# When compute nodes and control nodes use different interfaces,
# you can specify "api_interface" and other interfaces like below:
#compute01 neutron_external_interface=eth0 api_interface=em1 storage_interface=em1 tunnel_interface=em1

[storage]
172.30.220.5

[deployment]
localhost ansible_connection=local

[targets]
172.30.220.3
172.30.220.4
172.30.220.5

[targets:vars]
ansible_user=*******
ansible_ssh_pass=******

The rest is the same as the default multinode inventory file.

Tags: nova
description: updated
Revision history for this message
Eduardo Gonzalez (egonzalez90) wrote :

Please share the nova-compute logs from the compute host.

Changed in kolla-ansible:
status: New → Incomplete
Revision history for this message
Eduardo Gonzalez (egonzalez90) wrote :

Also, share the scheduler logs, the globals.yml config, and the inventory file (change the IPs to different ones if needed).

Revision history for this message
José Donoso (jose.manuel.akainix) wrote :
Download full text (15.3 KiB)

scheduler.log

2017-10-12 12:14:01.967 7 DEBUG nova.scheduler.manager [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Starting to schedule for instances: [u'0e33fd82-a847-4b77-8290-01af48faf79f'] select_destinations /usr/lib/python2.7/site-packages/nova/scheduler/manager.py:113
2017-10-12 12:14:02.680 7 DEBUG nova.scheduler.manager [req-efa73f8e-c7c6-4e0e-bf1a-712b7d5afa29 d45a3d94c6fc4e34b7b9d6bbab8277b4 189671b052b34af8b34edbf9a3f87e76 - default default] Got no allocation candidates from the Placement API. This may be a temporary occurrence as compute nodes start up and begin reporting inventory to the Placement service. select_destinations /usr/lib/python2.7/site-packages/nova/scheduler/manager.py:133
2017-10-12 12:14:03.754 7 DEBUG oslo_concurrency.lockutils [req-a2d9faea-d259-4315-a3fc-31cfafaaed4c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.sync_instance_info" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2017-10-12 12:14:03.755 7 INFO nova.scheduler.host_manager [req-a2d9faea-d259-4315-a3fc-31cfafaaed4c - - - - -] Successfully synced instances from host 'chihiro.akainix.local'.
2017-10-12 12:14:03.755 7 DEBUG oslo_concurrency.lockutils [req-a2d9faea-d259-4315-a3fc-31cfafaaed4c - - - - -] Lock "host_instance" released by "nova.scheduler.host_manager.sync_instance_info" :: held 0.001s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:282
2017-10-12 12:14:04.057 7 DEBUG oslo_service.periodic_task [req-9d92bc3f-18a0-4481-8fc3-218fe991a94b - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/site-packages/oslo_service/periodic_task.py:215
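
The "Got no allocation candidates from the Placement API" line above means the scheduler found no resource provider with enough free inventory for the request. A few checks that are relevant here, given as hedged, generic admin-CLI examples rather than anything specific to this deployment (the last command additionally needs the osc-placement plugin):

openstack compute service list
openstack hypervisor list
openstack hypervisor stats show
openstack resource provider list   # requires the osc-placement plugin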

Inventory file

# These initial groups are the only groups required to be modified. The
# additional groups are for more control of the environment.
[control]
# These hostname must be resolvable from your deployment host
172.30.220.3

# The above can also be specified as follows:
#control[01:03] ansible_user=kolla

# The network nodes are where your l3-agent and loadbalancers will run
# This can be the same as a host in the control group
[network]
172.30.220.3

[compute]
172.30.220.3
172.30.220.4

[monitoring]
172.30.220.3

# When compute nodes and control nodes use different interfaces,
# you can specify "api_interface" and other interfaces like below:
#compute01 neutron_external_interface=eth0 api_interface=em1 storage_interface=em1 tunnel_interface=em1

[storage]
172.30.220.5

[deployment]
localhost ansible_connection=local

[targets]
172.30.220.3
172.30.220.4
172.30.220.5

[targets:vars]
ansible_user=*******
ansible_ssh_pass=******

The rest is the same as the default multinode inventory file.

And the globals.yml file:

---
# You can use this file to override _any_ variable throughout Kolla.
# Additional options can be found in the
# 'kolla-ansible/ansible/group_vars/all.yml' file. Default value of all the
# commented parameters are shown here, To override the default value uncomment
# the parameter and change its value.

###############
# Kolla options
###############
# Va...

Revision history for this message
José Donoso (jose.manuel.akainix) wrote :

I reviewed the hypervisor stats and I see that I don't have any free disk space on my hypervisors:

+----------------------+--------+
| Property | Value |
+----------------------+--------+
| count | 2 |
| current_workload | 0 |
| disk_available_least | 0 |
| free_disk_gb | 0 |
| free_ram_mb | 785220 |
| local_gb | 0 |
| local_gb_used | 0 |
| memory_mb | 786244 |
| memory_mb_used | 1024 |
| running_vms | 0 |
| vcpus | 96 |
| vcpus_used | 0 |
+----------------------+--------+

How can I change this if I am using Ceph?
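
For what it's worth, what free_disk_gb reports depends on the image backend nova-compute is using: when Nova is backed by Ceph, the relevant options live in the [libvirt] section of nova.conf on the compute hosts. This is a sketch with placeholder values (with Kolla the file typically sits under /etc/kolla/nova-compute/nova.conf, an assumption on my part), not taken from this deployment:

[libvirt]
images_type = rbd
images_rbd_pool = vms
images_rbd_ceph_conf = /etc/ceph/ceph.conf
rbd_user = <ceph client name>
rbd_secret_uuid = <libvirt secret uuid>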

description: updated
Revision history for this message
Javier Castillo (javcasalc) wrote :

In a lab deployment, I've got the same problem. I think it's related to Ceph/Cinder/Glance. As a result, my only lab hypervisor has "free_disk_gb = 0" (which is wrong; it should report the disk space available in the Ceph volumes) and it cannot deploy any instance:

$ openstack hypervisor stats show
+----------------------+-------+
| Field | Value |
+----------------------+-------+
| count | 1 |
| current_workload | 0 |
| disk_available_least | 0 |
| free_disk_gb | 0 |
| free_ram_mb | 9487 |
| local_gb | 0 |
| local_gb_used | 0 |
| memory_mb | 9999 |
| memory_mb_used | 512 |
| running_vms | 0 |
| vcpus | 6 |
| vcpus_used | 0 |
+----------------------+-------+

Revision history for this message
Javier Castillo (javcasalc) wrote :
Download full text (3.7 KiB)

In a lab deployment, I've got the same problem. I think it's related to Ceph/Cinder/Glance. As a result, my only lab hypervisor has "free_disk_gb = 0" (which is wrong; it should report the disk space available in the Ceph volumes) and it cannot deploy any instance:

$ openstack hypervisor stats show
+----------------------+-------+
| Field | Value |
+----------------------+-------+
| count | 1 |
| current_workload | 0 |
| disk_available_least | 0 |
| free_disk_gb | 0 |
| free_ram_mb | 9487 |
| local_gb | 0 |
| local_gb_used | 0 |
| memory_mb | 9999 |
| memory_mb_used | 512 |
| running_vms | 0 |
| vcpus | 6 |
| vcpus_used | 0 |
+----------------------+-------+

Checking the cinder_volume logs on one storage node, I get:

2017-10-23 08:49:07.768 735 INFO cinder.volume.manager [req-af1815ba-30f8-4a72-8097-756219536c65 - - - - -] Starting volume driver RBDDriver (1.2.0)
2017-10-23 08:49:07.770 735 DEBUG cinder.volume.drivers.rbd [req-af1815ba-30f8-4a72-8097-756219536c65 - - - - -] connecting to ceph (timeout=5). _do_conn /var/lib/kolla/venv/local/lib/python2.7/site-packages/cinder/volume/drivers/rbd.py:308
2017-10-23 08:49:07.794 735 DEBUG cinder.volume.drivers.rbd [req-af1815ba-30f8-4a72-8097-756219536c65 - - - - -] connecting to ceph (timeout=5). _do_conn /var/lib/kolla/venv/local/lib/python2.7/site-packages/cinder/volume/drivers/rbd.py:308
2017-10-23 08:49:07.816 735 ERROR oslo_service.service [req-af1815ba-30f8-4a72-8097-756219536c65 - - - - -] Error starting thread.: IndexError: list index out of range
2017-10-23 08:49:07.816 735 ERROR oslo_service.service Traceback (most recent call last):
2017-10-23 08:49:07.816 735 ERROR oslo_service.service File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/oslo_service/service.py", line 721, in run_service
2017-10-23 08:49:07.816 735 ERROR oslo_service.service service.start()
2017-10-23 08:49:07.816 735 ERROR oslo_service.service File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/cinder/service.py", line 242, in start
2017-10-23 08:49:07.816 735 ERROR oslo_service.service service_id=Service.service_id)
2017-10-23 08:49:07.816 735 ERROR oslo_service.service File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/cinder/volume/manager.py", line 440, in init_host
2017-10-23 08:49:07.816 735 ERROR oslo_service.service self.driver.init_capabilities()
2017-10-23 08:49:07.816 735 ERROR oslo_service.service File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/cinder/volume/driver.py", line 704, in init_capabilities
2017-10-23 08:49:07.816 735 ERROR oslo_service.service stats = self.get_volume_stats(True)
2017-10-23 08:49:07.816 735 ERROR oslo_service.service File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/cinder/volume/drivers/rbd.py", line 483, in get_volume_stats
2017-10-23 08:49:07.816 735 ERROR oslo_service.service self._update_volume_stats()
2017-10-23 08:49:07.816 735 ERROR oslo_service.service File "/var/lib/kolla/venv/local/lib/python2.7/site-packages/cinder/volume/drivers/rbd.py", line 466, i...

Read more...

Revision history for this message
Javier Castillo (javcasalc) wrote :

Another strange thing related to this bug:

exec "rados df" from mon/control nodes, shows empty data:

(ceph-mon)[root@control01 /]# rados lspools
.rgw.root
default.rgw.control
default.rgw.meta
default.rgw.log
images
volumes
backups
vms
gnocchi

(ceph-mon)[root@control01 /]# rados df
POOL_NAME USED OBJECTS CLONES COPIES MISSING_ON_PRIMARY UNFOUND DEGRADED RD_OPS RD WR_OPS WR

total_objects 0
total_used 0
total_avail 0
total_space 0

Checking the official Ceph docs, it looks like "ceph-mgr" is mandatory as of the Ceph 12.2 version: http://docs.ceph.com/docs/master/release-notes/
There is a new daemon, ceph-mgr, which is a required part of any Ceph deployment. Although IO can continue when ceph-mgr is down, metrics will not refresh and some metrics-related calls (e.g., ceph df) may block.

But in my deployment, I don't have any ceph-mgr running, so it makes sense that no valid pool data is available to RBD clients (like "rados df" or cinder-volume).
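
A couple of checks for that theory, again as hedged examples (the ceph commands are standard Luminous CLI; the Kolla container name is an assumption):

(ceph-mon)[root@control01 /]# ceph -s              # the services section shows whether any mgr daemon is active
(ceph-mon)[root@control01 /]# ceph mgr module ls
docker ps --filter name=ceph_mgr                   # run on the host: was a mgr container deployed at all?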

Revision history for this message
Launchpad Janitor (janitor) wrote :

[Expired for kolla-ansible because there has been no activity for 60 days.]

Changed in kolla-ansible:
status: Incomplete → Expired