Cannot create instance when using the VMware Nova driver

Bug #1744182 reported by lws
This bug affects 1 person
Affects: OpenStack Compute (nova)
Status: Incomplete
Importance: Low
Assigned to: Radoslav Gerganov

Bug Description

Hi, everybody.

Environment
===============
OpenStack: Pike, installed with kolla-ansible
OS: CentOS 7.4

Logs
==============
I am getting the following error when I try to create an instance from a VMDK image.
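
For context, the image and instance were created roughly like this (a sketch using shade; the cloud name, file names, flavor, network and image properties are my assumptions, not the exact commands):

import shade

# Sketch only: upload a VMDK image with the VMware-specific Glance
# properties, then boot an instance from it.
cloud = shade.openstack_cloud(cloud='kolla')

image = cloud.create_image(
    'cirros-vmdk',
    filename='cirros.vmdk',
    disk_format='vmdk',
    container_format='bare',
    meta={'vmware_disktype': 'sparse',
          'vmware_adaptertype': 'lsiLogic'},
    wait=True,
)

flavor = cloud.get_flavor('m1.tiny')
server = cloud.create_server(
    'test-vm',
    image=image.id,
    flavor=flavor,
    network='demo-net',
    wait=True,
)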

2018-01-18 06:40:01.045 7 DEBUG oslo_concurrency.lockutils [req-bc40738a-a3ee-4d9c-bd67-32e6fb32df08 32e0ed602bc549f48f7caf401420b628 7179dd1be7ef4cf2906b41b97970a0f6 - default default] Releasing semaphore "refresh_cache-b4b7cabe-f78b-40d9-8856-3b6c213efd73" lock /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:225
2018-01-18 06:40:01.046 7 DEBUG nova.compute.manager [req-bc40738a-a3ee-4d9c-bd67-32e6fb32df08 32e0ed602bc549f48f7caf401420b628 7179dd1be7ef4cf2906b41b97970a0f6 - default default] [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] Instance network_info: |[{"profile": {}, "ovs_interfaceid": "61393d4a-5b1f-4113-ac7b-251dc4afe066", "preserve_on_delete": false, "network": {"bridge": "br-int", "subnets": [{"ips": [{"meta": {}, "version": 4, "type": "fixed", "floating_ips": [], "address": "10.0.0.9"}], "version": 4, "meta": {"dhcp_server": "10.0.0.2"}, "dns": [{"meta": {}, "version": 4, "type": "dns", "address": "8.8.8.8"}], "routes": [], "cidr": "10.0.0.0/24", "gateway": {"meta": {}, "version": 4, "type": "gateway", "address": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7179dd1be7ef4cf2906b41b97970a0f6", "mtu": 1450}, "id": "afa8c911-a770-4b9d-a8e8-13c65e691d46", "label": "demo-net"}, "devname": "tap61393d4a-5b", "vnic_type": "normal", "qbh_params": null, "meta": {}, "details": {"port_filter": true, "datapath_type": "system", "ovs_hybrid_plug": true}, "address": "fa:16:3e:b5:bd:28", "active": false, "type": "ovs", "id": "61393d4a-5b1f-4113-ac7b-251dc4afe066", "qbg_params": null}]| _allocate_network_async /var/lib/kolla/venv/lib/python2.7/site-packages/nova/compute/manager.py:1387
2018-01-18 06:40:01.075 7 DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-294a05f6-f165-425d-bdce-df36982ae5b3 request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354
2018-01-18 06:40:01.090 7 DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5435f67-2b69-44c9-996d-1550f6139c9d request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354
2018-01-18 06:40:01.105 7 DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4d7bf28-4877-40fc-a425-2f520501e3f8 request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354
2018-01-18 06:40:01.118 7 DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c210dfa-c695-4a88-bfc9-140ae913c53e request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354
2018-01-18 06:40:01.132 7 DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04a1a599-5fe5-41d1-838f-5a50a76cf9c2 request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354
2018-01-18 06:40:01.143 7 DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6681887f-3579-40ec-b7c3-0bd5d1f51fb2 request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354
2018-01-18 06:40:01.154 7 DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72e556d4-863d-4f7e-9e91-bb3396f86fb2 request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354
2018-01-18 06:40:01.169 7 DEBUG nova.virt.vmwareapi.vm_util [req-bc40738a-a3ee-4d9c-bd67-32e6fb32df08 32e0ed602bc549f48f7caf401420b628 7179dd1be7ef4cf2906b41b97970a0f6 - default default] [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] Creating VM on the ESX host create_vm /var/lib/kolla/venv/lib/python2.7/site-packages/nova/virt/vmwareapi/vm_util.py:1312
2018-01-18 06:40:01.170 7 DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d28e35a6-3139-4592-8b9c-b9f8772f164e request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354
2018-01-18 06:40:01.209 7 DEBUG oslo_vmware.api [req-bc40738a-a3ee-4d9c-bd67-32e6fb32df08 32e0ed602bc549f48f7caf401420b628 7179dd1be7ef4cf2906b41b97970a0f6 - default default] Waiting for the task: (returnval){
   value = "task-137"
   _type = "Task"
 } to complete. wait_for_task /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/api.py:395
2018-01-18 06:40:01.224 7 DEBUG oslo_vmware.api [-] Task: {'id': task-137, 'name': CreateVM_Task} progress is 0%. _poll_task /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/api.py:431
2018-01-18 06:40:01.325 7 DEBUG nova.compute.manager [req-2771ee21-f6dc-46c2-9c23-4f64a6e8e4eb bfe8bd0ac2e8471f93e455f10e4f6a61 a423e2518d1e42a5a688b467655795ab - default default] [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] Received event network-changed-61393d4a-5b1f-4113-ac7b-251dc4afe066 external_instance_event /var/lib/kolla/venv/lib/python2.7/site-packages/nova/compute/manager.py:6957
2018-01-18 06:40:01.325 7 DEBUG oslo_concurrency.lockutils [req-2771ee21-f6dc-46c2-9c23-4f64a6e8e4eb bfe8bd0ac2e8471f93e455f10e4f6a61 a423e2518d1e42a5a688b467655795ab - default default] Acquired semaphore "refresh_cache-b4b7cabe-f78b-40d9-8856-3b6c213efd73" lock /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:212
2018-01-18 06:40:01.326 7 DEBUG nova.network.neutronv2.api [req-2771ee21-f6dc-46c2-9c23-4f64a6e8e4eb bfe8bd0ac2e8471f93e455f10e4f6a61 a423e2518d1e42a5a688b467655795ab - default default] [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] _get_instance_nw_info() _get_instance_nw_info /var/lib/kolla/venv/lib/python2.7/site-packages/nova/network/neutronv2/api.py:1302
2018-01-18 06:40:01.735 7 DEBUG oslo_vmware.exceptions [-] Fault PlatformConfigFault not matched. get_fault_class /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/exceptions.py:295
2018-01-18 06:40:01.736 7 ERROR oslo_vmware.common.loopingcall [-] in fixed duration looping call: VimFaultException: An error occurred during host configuration.
Faults: ['PlatformConfigFault']
2018-01-18 06:40:01.736 7 ERROR oslo_vmware.common.loopingcall Traceback (most recent call last):
2018-01-18 06:40:01.736 7 ERROR oslo_vmware.common.loopingcall File "/var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
2018-01-18 06:40:01.736 7 ERROR oslo_vmware.common.loopingcall self.f(*self.args, **self.kw)
2018-01-18 06:40:01.736 7 ERROR oslo_vmware.common.loopingcall File "/var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/api.py", line 452, in _poll_task
2018-01-18 06:40:01.736 7 ERROR oslo_vmware.common.loopingcall raise task_ex
2018-01-18 06:40:01.736 7 ERROR oslo_vmware.common.loopingcall VimFaultException: An error occurred during host configuration.
2018-01-18 06:40:01.736 7 ERROR oslo_vmware.common.loopingcall Faults: ['PlatformConfigFault']
2018-01-18 06:40:01.736 7 ERROR oslo_vmware.common.loopingcall
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [req-bc40738a-a3ee-4d9c-bd67-32e6fb32df08 32e0ed602bc549f48f7caf401420b628 7179dd1be7ef4cf2906b41b97970a0f6 - default default] [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] Instance failed to spawn: VimFaultException: An error occurred during host configuration.
Faults: ['PlatformConfigFault']
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] Traceback (most recent call last):
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] File "/var/lib/kolla/venv/lib/python2.7/site-packages/nova/compute/manager.py", line 2162, in _build_resources
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] yield resources
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] File "/var/lib/kolla/venv/lib/python2.7/site-packages/nova/compute/manager.py", line 1977, in _build_and_run_instance
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] block_device_info=block_device_info)
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] File "/var/lib/kolla/venv/lib/python2.7/site-packages/nova/virt/vmwareapi/driver.py", line 323, in spawn
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] admin_password, network_info, block_device_info)
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] File "/var/lib/kolla/venv/lib/python2.7/site-packages/nova/virt/vmwareapi/vmops.py", line 735, in spawn
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] metadata)
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] File "/var/lib/kolla/venv/lib/python2.7/site-packages/nova/virt/vmwareapi/vmops.py", line 301, in build_virtual_machine
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] config_spec, self._root_resource_pool)
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] File "/var/lib/kolla/venv/lib/python2.7/site-packages/nova/virt/vmwareapi/vm_util.py", line 1333, in create_vm
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] {'ostype': config_spec.guestId})
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] File "/var/lib/kolla/venv/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] self.force_reraise()
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] File "/var/lib/kolla/venv/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] six.reraise(self.type_, self.value, self.tb)
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] File "/var/lib/kolla/venv/lib/python2.7/site-packages/nova/virt/vmwareapi/vm_util.py", line 1318, in create_vm
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] task_info = session._wait_for_task(vm_create_task)
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] File "/var/lib/kolla/venv/lib/python2.7/site-packages/nova/virt/vmwareapi/driver.py", line 545, in _wait_for_task
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] return self.wait_for_task(task_ref)
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] File "/var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/api.py", line 396, in wait_for_task
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] return evt.wait()
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] File "/var/lib/kolla/venv/lib/python2.7/site-packages/eventlet/event.py", line 121, in wait
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] return hubs.get_hub().switch()
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] File "/var/lib/kolla/venv/lib/python2.7/site-packages/eventlet/hubs/hub.py", line 294, in switch
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] return self.greenlet.switch()
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] File "/var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] self.f(*self.args, **self.kw)
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] File "/var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/api.py", line 452, in _poll_task
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] raise task_ex
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] VimFaultException: An error occurred during host configuration.
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] Faults: ['PlatformConfigFault']
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73]
2018-01-18 06:40:01.743 7 INFO nova.compute.manager [req-bc40738a-a3ee-4d9c-bd67-32e6fb32df08 32e0ed602bc549f48f7caf401420b628 7179dd1be7ef4cf2906b41b97970a0f6 - default default] [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] Terminating instance
2018-01-18 06:40:01.745 7 DEBUG nova.compute.manager [req-bc40738a-a3ee-4d9c-bd67-32e6fb32df08 32e0ed602bc549f48f7caf401420b628 7179dd1be7ef4cf2906b41b97970a0f6 - default default] [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] Start destroying the instance on the hypervisor. _shutdown_instance /var/lib/kolla/venv/lib/python2.7/site-packages/nova/compute/manager.py:2276
2018-01-18 06:40:01.746 7 DEBUG nova.virt.vmwareapi.vmops [req-bc40738a-a3ee-4d9c-bd67-32e6fb32df08 32e0ed602bc549f48f7caf401420b628 7179dd1be7ef4cf2906b41b97970a0f6 - default default] [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] Destroying instance destroy /var/lib/kolla/venv/lib/python2.7/site-packages/nova/virt/vmwareapi/vmops.py:1122
2018-01-18 06:40:01.746 7 DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1ecff06e-c407-4849-830a-0fe2a6426ba8 request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354
2018-01-18 06:40:01.762 7 DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62d74ff8-a07c-433d-8668-1d2df52e502f request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354
2018-01-18 06:40:01.789 7 DEBUG oslo_vmware.service [-] RetrievePropertiesEx API response is empty; setting fault to NotAuthenticated. _retrieve_properties_ex_fault_checker /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:267
2018-01-18 06:40:01.790 7 DEBUG oslo_vmware.api [-] Checking if the current session: ee9b8 is active. is_current_session_active /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/api.py:365
2018-01-18 06:40:01.790 7 DEBUG oslo_vmware.service [-] Invoking SessionManager.SessionIsActive with opID=oslo.vmware-54dd44fd-f3ff-4797-bcd0-b25ba4e70f79 request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354
2018-01-18 06:40:01.796 7 DEBUG oslo_vmware.api [-] Returning empty response for <module 'nova.virt.vmwareapi.vim_util' from '/var/lib/kolla/venv/lib/python2.7/site-packages/nova/virt/vmwareapi/vim_util.pyc'>.get_objects invocation. _invoke_api /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/api.py:316
2018-01-18 06:40:01.803 7 DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94c941a0-a3f6-4c54-a814-c4402d642c25 request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354
2018-01-18 06:40:01.830 7 DEBUG oslo_vmware.service [-] RetrievePropertiesEx API response is empty; setting fault to NotAuthenticated. _retrieve_properties_ex_fault_checker /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:267
2018-01-18 06:40:01.830 7 DEBUG oslo_vmware.api [-] Checking if the current session: ee9b8 is active. is_current_session_active /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/api.py:365
2018-01-18 06:40:01.830 7 DEBUG oslo_vmware.service [-] Invoking SessionManager.SessionIsActive with opID=oslo.vmware-8b5529ca-af1e-44f2-bc13-522bf9219a8e request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354
2018-01-18 06:40:01.836 7 DEBUG oslo_vmware.api [-] Returning empty response for <module 'nova.virt.vmwareapi.vim_util' from '/var/lib/kolla/venv/lib/python2.7/site-packages/nova/virt/vmwareapi/vim_util.pyc'>.get_objects invocation. _invoke_api /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/api.py:316
2018-01-18 06:40:01.842 7 DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-104759f0-5e82-4a47-b3aa-67f5d3cf3bc3 request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354
2018-01-18 06:40:01.867 7 DEBUG oslo_vmware.service [-] RetrievePropertiesEx API response is empty; setting fault to NotAuthenticated. _retrieve_properties_ex_fault_checker /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:267
2018-01-18 06:40:01.867 7 DEBUG oslo_vmware.api [-] Checking if the current session: ee9b8 is active. is_current_session_active /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/api.py:365
2018-01-18 06:40:01.868 7 DEBUG oslo_vmware.service [-] Invoking SessionManager.SessionIsActive with opID=oslo.vmware-d71bae51-04a0-4b88-be71-702fbdaec9cf request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354
2018-01-18 06:40:01.873 7 DEBUG oslo_vmware.api [-] Returning empty response for <module 'nova.virt.vmwareapi.vim_util' from '/var/lib/kolla/venv/lib/python2.7/site-packages/nova/virt/vmwareapi/vim_util.pyc'>.get_objects invocation. _invoke_api /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/api.py:316
2018-01-18 06:40:01.874 7 WARNING nova.virt.vmwareapi.vmops [req-bc40738a-a3ee-4d9c-bd67-32e6fb32df08 32e0ed602bc549f48f7caf401420b628 7179dd1be7ef4cf2906b41b97970a0f6 - default default] [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] Instance does not exist on backend: InstanceNotFound: Instance b4b7cabe-f78b-40d9-8856-3b6c213efd73 could not be found.
2018-01-18 06:40:01.874 7 DEBUG nova.virt.vmwareapi.vmops [req-bc40738a-a3ee-4d9c-bd67-32e6fb32df08 32e0ed602bc549f48f7caf401420b628 7179dd1be7ef4cf2906b41b97970a0f6 - default default] [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] Instance destroyed destroy /var/lib/kolla/venv/lib/python2.7/site-packages/nova/virt/vmwareapi/vmops.py:1124
2018-01-18 06:40:01.874 7 INFO nova.compute.manager [req-bc40738a-a3ee-4d9c-bd67-32e6fb32df08 32e0ed602bc549f48f7caf401420b628 7179dd1be7ef4cf2906b41b97970a0f6 - default default] [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] Took 0.13 seconds to destroy the instance on the hypervisor.
2018-01-18 06:40:01.875 7 DEBUG nova.compute.claims [req-bc40738a-a3ee-4d9c-bd67-32e6fb32df08 32e0ed602bc549f48f7caf401420b628 7179dd1be7ef4cf2906b41b97970a0f6 - default default] [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] Aborting claim: [Claim: 512 MB memory, 1 GB disk] abort /var/lib/kolla/venv/lib/python2.7/site-packages/nova/compute/claims.py:123
2018-01-18 06:40:01.876 7 DEBUG oslo_concurrency.lockutils [req-bc40738a-a3ee-4d9c-bd67-32e6fb32df08 32e0ed602bc549f48f7caf401420b628 7179dd1be7ef4cf2906b41b97970a0f6 - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.abort_instance_claim" :: waited 0.000s inner /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:270
2018-01-18 06:40:01.876 7 DEBUG nova.compute.resource_tracker [req-bc40738a-a3ee-4d9c-bd67-32e6fb32df08 32e0ed602bc549f48f7caf401420b628 7179dd1be7ef4cf2906b41b97970a0f6 - default default] We're on a Pike compute host in a deployment with all Pike compute hosts. Skipping auto-correction of allocations. _update_usage_from_instance /var/lib/kolla/venv/lib/python2.7/site-packages/nova/compute/resource_tracker.py:1071

Configs
==============
Here is my nova.conf:

[DEFAULT]
use_neutron=True
vif_plugging_timeout = 300
vif_plugging_is_fatal = True
debug = True
log_dir = /var/log/kolla/nova
state_path = /var/lib/nova
osapi_compute_listen = 10.200.110.11
osapi_compute_listen_port = 8774
osapi_compute_workers = 5
metadata_workers = 5
metadata_listen = 10.200.110.11
metadata_listen_port = 8775
firewall_driver = nova.virt.firewall.NoopFirewallDriver
allow_resize_to_same_host = true
compute_driver = vmwareapi.VMwareVCDriver
my_ip = 10.200.110.11
transport_url = rabbit://openstack:jYEGHFwtk3abMapZgLxjTenLICYwttgkhi15t0Cy@10.200.110.11:5672

[api]
use_forwarded_for = true

[conductor]
workers = 5

[vnc]
novncproxy_host = 10.200.110.11
novncproxy_port = 6080
vncserver_listen = 10.200.110.11
vncserver_proxyclient_address = 10.200.110.11
novncproxy_base_url = http://10.200.110.83:6080/vnc_auto.html

[oslo_concurrency]
lock_path = /var/lib/nova/tmp

[glance]
api_servers = http://10.200.110.83:9292
num_retries = 1
debug = False

[neutron]
url = http://10.200.110.83:9696
metadata_proxy_shared_secret = s4xoCyOxqYaAtfqk9Sr5WCYq54pagkDvzI6wsMp7
service_metadata_proxy = true
auth_url = http://10.200.110.83:35357
auth_type = password
project_domain_name = Default
user_domain_id = default
project_name = service
username = neutron
password = cIDr9b8tIHYVh4eq8veAC8x9BJJODregeTFDgYhC

[database]
connection = mysql+pymysql://nova:PJ4j2T2vU81TUcy2wnga3ygryqmfkY8WGnzRFD5u@10.200.110.83:3306/nova
max_pool_size = 50
max_overflow = 1000
max_retries = -1

[api_database]
connection = mysql+pymysql://nova_api:2QMPJ5muJvjQSeeidV0kyT6WkocCW0gMIsVWJC0N@10.200.110.83:3306/nova_api
max_retries = -1

[cache]
backend = oslo_cache.memcache_pool
enabled = True
memcache_servers = 10.200.110.11:11211

[keystone_authtoken]
auth_uri = http://10.200.110.83:5000
auth_url = http://10.200.110.83:35357
auth_type = password
project_domain_id = default
user_domain_id = default
project_name = service
username = nova
password = W3Grrkic8Cdq3EzdyMNeFGBdRvbD7u1rgvGpxadZ
memcache_security_strategy = ENCRYPT
memcache_secret_key = OeCvpVyzCumdTBzGKdyFvUGvCsVqwhykUs0LpQuM
memcached_servers = 10.200.110.11:11211

[libvirt]
connection_uri = qemu+tcp://10.200.110.11/system
virt_type = kvm

[upgrade_levels]
compute = auto

[oslo_messaging_notifications]
driver = noop

[privsep_entrypoint]
helper_command = sudo nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova.conf

[guestfs]
debug = False

[wsgi]
api_paste_config = /etc/nova/api-paste.ini

[scheduler]
max_attempts = 10
discover_hosts_in_cells_interval = 60

[placement]
auth_type = password
auth_url = http://10.200.110.83:35357
username = placement
password = 531RY43WdkEkYWOnJfbctdd5GCKIZDaLqkilCdjD
user_domain_name = Default
project_name = service
project_domain_name = Default
os_region_name = RegionOne
os_interface = internal

[notifications]

[vmware]
host_ip = 10.200.110.73
host_username = <email address hidden>
host_password = openstack@
cluster_name = stack
api_retry_count = 10
vlan_interface = vmnic0
integration_bridge = br-int
wsdl_locations = https://vcenter.vsphere.local/sdk/vimService.wsdl
datastore_regex = datastore1
insecure = True
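
As a sanity check, the [vmware] endpoint and credentials can be exercised outside of Nova with oslo.vmware (a minimal sketch; the username is a placeholder for the value hidden above, and the positional arguments follow oslo.vmware's VMwareAPISession):

from oslo_vmware import api

# Sketch only: open a vSphere API session using the same values as the
# [vmware] section above, then check that it is alive.
session = api.VMwareAPISession(
    '10.200.110.73',                # host_ip
    'administrator@vsphere.local',  # placeholder for the hidden host_username
    'openstack@',                   # host_password
    10,                             # api_retry_count
    0.5,                            # task_poll_interval, in seconds
)
print('vCenter session active: %s' % session.is_current_session_active())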

Tags: vmware
Revision history for this message
lws (openstack1) wrote :

This error occurred when I created the virtual machine. Can someone help me?

Revision history for this message
Matt Riedemann (mriedem) wrote :

2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] VimFaultException: An error occurred during host configuration.
2018-01-18 06:40:01.738 7 ERROR nova.compute.manager [instance: b4b7cabe-f78b-40d9-8856-3b6c213efd73] Faults: ['PlatformConfigFault']

I'm not sure what that means.

tags: added: vmware
Revision history for this message
Giridhar Jayavelu (gjayavelu) wrote :

Can you please upload the log /var/log/vmware/vpxd/vpxd.log from vCenter (10.200.110.73)?

Revision history for this message
Sylvain Bauza (sylvain-bauza) wrote :

Setting to Incomplete until the reporter provides the requested log.
Reporter, please set the status back to 'New' once you provide it.

Changed in nova:
status: New → Incomplete
Revision history for this message
lws (openstack1) wrote :

Sorry for the late reply.
Here is /var/log/vmware/vpxd/vpxd.log from vCenter (10.200.110.73):
2018-02-23T02:03:20.994Z info vpxd[7FFA26997700] [Originator@6876 sub=vpxLro opID=1d797aaa] [VpxLRO] -- FINISH lro-697
2018-02-23T02:03:20.994Z info vpxd[7FFA26997700] [Originator@6876 sub=Default opID=1d797aaa] [VpxLRO] -- ERROR lro-697 -- CustomFieldsManager -- vim.CustomFieldsManager.addFieldDefinition: vim.fault.DuplicateName:
--> Result:
--> (vim.fault.DuplicateName) {
--> faultCause = (vmodl.MethodFault) null,
--> faultMessage = <unset>,
--> name = "com.vmware.vsan.clustermembers24",
--> object = 'vim.CustomFieldsManager:CustomFieldsManager'
--> msg = ""
--> }
--> Args:
-->
--> Arg name:
--> "com.vmware.vsan.clustermembers24"
--> Arg moType:
--> "vim.ClusterComputeResource"
--> Arg fieldDefPolicy:
-->
--> Arg fieldPolicy:
-->
2018-02-23T02:03:20.996Z info vpxd[7FFA26997700] [Originator@6876 sub=vpxLro opID=12643e04] [VpxLRO] -- BEGIN lro-698 -- CustomFieldsManager -- vim.CustomFieldsManager.addFieldDefinition -- 52f42b19-fdbd-73e5-5e78-746ba8302c8e(52e26d8a-5738-7648-e9e0-6a3d8ea050a3)
2018-02-23T02:03:20.996Z info vpxd[7FFA26997700] [Originator@6876 sub=vpxLro opID=12643e04] [VpxLRO] -- FINISH lro-698
2018-02-23T02:03:20.996Z info vpxd[7FFA26997700] [Originator@6876 sub=Default opID=12643e04] [VpxLRO] -- ERROR lro-698 -- CustomFieldsManager -- vim.CustomFieldsManager.addFieldDefinition: vim.fault.DuplicateName:
--> Result:
--> (vim.fault.DuplicateName) {
--> faultCause = (vmodl.MethodFault) null,
--> faultMessage = <unset>,
--> name = "com.vmware.vsan.clustermembers25",
--> object = 'vim.CustomFieldsManager:CustomFieldsManager'
--> msg = ""
--> }
--> Args:
-->
--> Arg name:
--> "com.vmware.vsan.clustermembers25"
--> Arg moType:
--> "vim.ClusterComputeResource"
--> Arg fieldDefPolicy:
-->
--> Arg fieldPolicy:
-->
2018-02-23T02:03:21.012Z info vpxd[7FFA242CA700] [Originator@6876 sub=vpxLro opID=7323820a] [VpxLRO] -- BEGIN lro-699 -- CustomFieldsManager -- vim.CustomFieldsManager.addFieldDefinition -- 52f42b19-fdbd-73e5-5e78-746ba8302c8e(52e26d8a-5738-7648-e9e0-6a3d8ea050a3)
2018-02-23T02:03:21.012Z info vpxd[7FFA242CA700] [Originator@6876 sub=vpxLro opID=7323820a] [VpxLRO] -- FINISH lro-699
2018-02-23T02:03:21.012Z info vpxd[7FFA242CA700] [Originator@6876 sub=Default opID=7323820a] [VpxLRO] -- ERROR lro-699 -- CustomFieldsManager -- vim.CustomFieldsManager.addFieldDefinition: vim.fault.DuplicateName:
--> Result:
--> (vim.fault.DuplicateName) {
--> faultCause = (vmodl.MethodFault) null,
--> faultMessage = <unset>,
--> name = "com.vmware.vsan.clustermembers26",
--> object = 'vim.CustomFieldsManager:CustomFieldsManager'
--> msg = ""
--> }
--> Args:
-->
--> Arg name:
--> "com.vmware.vsan.clustermembers26"
--> Arg moType:
--> "vim.ClusterComputeResource"
--> Arg fieldDefPolicy:
-->
--> Arg fieldPolicy:
-->
2018-02-23T02:03:21.013Z info vpxd[7FFA242CA700] [Originator@6876 sub=vpxLro opID=202e9994] [VpxLRO] -- BEGIN lro-700 -- CustomFieldsManager -- vim.CustomFieldsManager.addFieldDefinition -- 52f42b19-fdbd-73e5-5e78-746ba8302c8e(52e26d8a-5738-7648-e9e0-6a3d8ea050a3)
2018-02-23T02:03:21.013Z info vp...

Changed in nova:
status: Incomplete → New
Revision history for this message
Radoslav Gerganov (rgerganov) wrote :

The vpxd log is from 2018-02-23 but the Nova failure occurred on 2018-01-18, so this log is not useful.

You can correlate log messages from nova-compute to log messages in vpxd.log using the opID. In your case the CreateVM_Task had opID=oslo.vmware-d28e35a6-3139-4592-8b9c-b9f8772f164e:

2018-01-18 06:40:01.170 7 DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d28e35a6-3139-4592-8b9c-b9f8772f164e request_handler /var/lib/kolla/venv/lib/python2.7/site-packages/oslo_vmware/service.py:354

Grep vpxd.log for this opID and paste the result here, or reproduce the problem again and do the same.
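
For example, something like this will pull out the relevant lines (a minimal sketch; the vpxd.log path assumes a standard vCenter Server Appliance):

import sys

# Sketch only: print every vpxd.log line that mentions the failing
# CreateVM_Task opID.
OPID = 'd28e35a6-3139-4592-8b9c-b9f8772f164e'

with open('/var/log/vmware/vpxd/vpxd.log') as log:
    for line in log:
        if OPID in line:
            sys.stdout.write(line)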

Changed in nova:
assignee: nobody → Radoslav Gerganov (rgerganov)
importance: Undecided → Low
Revision history for this message
melanie witt (melwitt) wrote :

I'm going to set this to Incomplete since Rado has asked for more information from the bug reporter to investigate this further.

Changed in nova:
status: New → Incomplete