Aug 30 14:01:01.547224 np0035104604 systemd[1]: Started Devstack devstack@n-cpu.service.
Aug 30 14:01:02.293823 np0035104604 nova-compute[107505]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
Aug 30 14:01:04.817444 np0035104604 nova-compute[107505]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=107505) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
Aug 30 14:01:04.817978 np0035104604 nova-compute[107505]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=107505) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
Aug 30 14:01:04.818542 np0035104604 nova-compute[107505]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=107505) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
Aug 30 14:01:04.819187 np0035104604 nova-compute[107505]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Aug 30 14:01:04.995011 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
Aug 30 14:01:05.009951 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.015s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
Aug 30 14:01:05.662386 np0035104604 nova-compute[107505]: INFO nova.virt.driver [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] Loading compute driver 'libvirt.LibvirtDriver'
Aug 30 14:01:05.850405 np0035104604 nova-compute[107505]: INFO nova.compute.provider_config [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Aug 30 14:01:05.883902 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] Acquiring lock "singleton_lock" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
Aug 30 14:01:05.884143 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] Acquired lock "singleton_lock" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
Aug 30 14:01:05.884620 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] Releasing lock "singleton_lock" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
Aug 30 14:01:05.885250 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] Full set of CONF: {{(pid=107505) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}}
Aug 30 14:01:05.885512 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ******************************************************************************** {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}}
Aug 30 14:01:05.885813 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] Configuration options gathered from: {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}}
Aug 30 14:01:05.886517 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] command line args: ['--config-file', '/etc/nova/nova-cpu.conf'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}}
Aug 30 14:01:05.886853 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] config files: ['/etc/nova/nova-cpu.conf'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}}
Aug 30 14:01:05.887130 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ================================================================================ {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}}
Aug 30 14:01:05.887628 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] allow_resize_to_same_host = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.887907 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] arq_binding_timeout = 300 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.888123 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] backdoor_port = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.888334 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] backdoor_socket = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.888992 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] block_device_allocate_retries = 60 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.889183 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] block_device_allocate_retries_interval = 3 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.889491 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cert = self.pem {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.889804 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] compute_driver = libvirt.LibvirtDriver {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.890315 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] compute_monitors = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.890425 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] config_dir = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.890635 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] config_drive_format = iso9660 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.890865 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] config_file = ['/etc/nova/nova-cpu.conf'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.891165 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] config_source = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.891558 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] console_host = np0035104604 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.892026 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] control_exchange = nova {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.892321 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cpu_allocation_ratio = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.892697 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] daemon = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.893012 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] debug = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.893287 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] default_access_ip_network_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.893579 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] default_availability_zone = nova {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.894026 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] default_ephemeral_format = ext4 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.894332 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] default_green_pool_size = 1000 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.894803 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.895122 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] default_schedule_zone = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.896312 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] disk_allocation_ratio = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.896312 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] enable_new_services = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.896312 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] enabled_apis = ['osapi_compute'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.896652 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] enabled_ssl_apis = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.897061 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] flat_injected = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.897581 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] force_config_drive = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.898070 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] force_raw_images = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.898519 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] graceful_shutdown_timeout = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.898954 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] heal_instance_info_cache_interval = 60 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.899467 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] host = np0035104604 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.899979 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.900343 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] initial_disk_allocation_ratio = 1.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.900823 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] initial_ram_allocation_ratio = 1.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.901263 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.901578 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] instance_build_timeout = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.902028 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] instance_delete_interval = 300 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.902479 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] instance_format = [instance: %(uuid)s] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.902950 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] instance_name_template = instance-%08x {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.903433 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] instance_usage_audit = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.903907 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] instance_usage_audit_period = month {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.904377 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.905018 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] instances_path = /opt/stack/data/nova/instances {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.905396 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] internal_service_availability_zone = internal {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.905914 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] key = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.906374 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] live_migration_retry_count = 30 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.906851 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] log_config_append = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.908547 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.908984 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] log_dir = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.909390 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] log_file = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.910308 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] log_options = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.910308 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] log_rotate_interval = 1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.910561 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] log_rotate_interval_type = days {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.910837 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] log_rotation_type = none {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.911056 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.911399 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.912375 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.912375 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.912375 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.912702 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] long_rpc_timeout = 1800 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.913015 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] max_concurrent_builds = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.913395 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] max_concurrent_live_migrations = 1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.913658 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] max_concurrent_snapshots = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.913954 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] max_local_block_devices = 3 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.914202 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] max_logfile_count = 30 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.914570 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] max_logfile_size_mb = 200 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.914972 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] maximum_instance_delete_attempts = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.915275 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] metadata_listen = 0.0.0.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.915545 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] metadata_listen_port = 8775 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.915888 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] metadata_workers = 2 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.916139 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] migrate_max_retries = -1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.916452 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] mkisofs_cmd = genisoimage {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.917497 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] my_block_storage_ip = 149.202.177.86 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.917497 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] my_ip = 149.202.177.86 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.917665 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] network_allocate_retries = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.917931 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.918196 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] osapi_compute_listen = 0.0.0.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.918497 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] osapi_compute_listen_port = 8774 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.919300 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] osapi_compute_unique_server_name_scope = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.919300 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] osapi_compute_workers = 2 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.919472 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] password_length = 12 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.919723 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] periodic_enable = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.920006 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] periodic_fuzzy_delay = 60 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.920390 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] pointer_model = ps2mouse {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.920857 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] preallocate_images = none {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.921242 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] publish_errors = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.921627 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] pybasedir = /opt/stack/nova {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.922180 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ram_allocation_ratio = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.922574 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] rate_limit_burst = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.923266 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] rate_limit_except_level = CRITICAL {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.923428 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] rate_limit_interval = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.923744 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] reboot_timeout = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.923996 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] reclaim_instance_interval = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.924340 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] record = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.924728 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] reimage_timeout_per_gb = 60 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.925123 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] report_interval = 120 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.925469 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] rescue_timeout = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.926055 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] reserved_host_cpus = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.926316 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] reserved_host_disk_mb = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.926693 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] reserved_host_memory_mb = 512 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.926995 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] reserved_huge_pages = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.927264 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] resize_confirm_window = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.927552 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] resize_fs_using_block_device = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.927808 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] resume_guests_state_on_host_boot = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.928186 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.928563 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] rpc_response_timeout = 60 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.935296 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] run_external_periodic_tasks = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.935296 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] running_deleted_instance_action = reap {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.937087 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] running_deleted_instance_poll_interval = 1800 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.937087 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] running_deleted_instance_timeout = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.937087 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] scheduler_instance_sync_interval = 120 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
Aug 30 14:01:05.937452 np0035104604 
nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] service_down_time = 720 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.937542 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] servicegroup_driver = db {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.937943 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] shelved_offload_time = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.938455 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] shelved_poll_interval = 3600 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.938720 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] shutdown_timeout = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.940511 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] source_is_ipv6 = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.940511 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ssl_only = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.940511 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None 
req-57093944-03fa-4a5f-826a-6048b3559346 None None] state_path = /opt/stack/data/nova {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.940511 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] sync_power_state_interval = 600 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.940511 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] sync_power_state_pool_size = 1000 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.940511 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] syslog_log_facility = LOG_USER {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.942733 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] tempdir = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.942733 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] timeout_nbd = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.942733 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] transport_url = **** {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.942733 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] 
update_resources_interval = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.942733 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] use_cow_images = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.942733 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] use_eventlog = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.943219 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] use_journal = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.943291 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] use_json = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.943649 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] use_rootwrap_daemon = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.944054 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] use_stderr = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.944492 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] use_syslog = False {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.944834 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vcpu_pin_set = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.945554 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vif_plugging_is_fatal = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.945692 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vif_plugging_timeout = 300 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.946133 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] virt_mkfs = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.946509 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] volume_usage_poll_interval = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.946887 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] watch_log_file = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 14:01:05.947364 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] web = /usr/share/spice-html5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} Aug 30 
14:01:05.947812 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_concurrency.disable_process_locking = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.948136 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.948512 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.948919 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.949238 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.949532 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.951511 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_metrics.metrics_thread_stop_timeout 
= 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.951511 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.auth_strategy = keystone {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.951511 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.compute_link_prefix = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.951511 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.951511 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.dhcp_domain = novalocal {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.951511 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.enable_instance_password = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.952114 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.glance_link_prefix = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.952114 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None 
req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.952352 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.952750 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.instance_list_per_project_cells = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.953042 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.list_records_by_skipping_down_cells = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.953303 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.local_metadata_per_cell = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.953556 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.max_limit = 1000 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.953974 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.metadata_cache_expiration = 15 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.954246 np0035104604 nova-compute[107505]: DEBUG 
oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.neutron_default_tenant_id = default {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.954504 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.use_forwarded_for = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.954801 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.use_neutron_default_nets = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.955161 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.955833 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.955833 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.956041 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.956294 np0035104604 
nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.vendordata_dynamic_targets = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.956632 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.vendordata_jsonfile_path = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.956981 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.957443 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.backend = dogpile.cache.memcached {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.957793 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.backend_argument = **** {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.958093 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.config_prefix = cache.oslo {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.958441 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.dead_timeout = 60.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.958726 np0035104604 
nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.debug_cache_backend = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.958980 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.enable_retry_client = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.959251 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.enable_socket_keepalive = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.959503 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.enabled = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.959751 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.expiration_time = 600 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.959993 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.hashclient_retry_attempts = 2 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.960252 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.hashclient_retry_delay = 1.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.960565 np0035104604 nova-compute[107505]: DEBUG 
oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.memcache_dead_retry = 300 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.960955 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.memcache_password = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.961369 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.961652 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.961959 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.memcache_pool_maxsize = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.962217 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.962630 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.memcache_sasl_enabled = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.962861 np0035104604 nova-compute[107505]: 
DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.963995 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.memcache_socket_timeout = 1.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.963995 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.memcache_username = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.963995 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.proxies = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.964213 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.retry_attempts = 2 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.964641 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.retry_delay = 0.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.964921 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.socket_keepalive_count = 1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.965174 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None 
req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.socket_keepalive_idle = 1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.965956 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.socket_keepalive_interval = 1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.965956 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.tls_allowed_ciphers = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.966260 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.tls_cafile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.966626 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.tls_certfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.967009 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.tls_enabled = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.967392 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cache.tls_keyfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.967891 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] 
cinder.auth_section = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.968304 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cinder.auth_type = password {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.968746 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cinder.cafile = /opt/stack/data/ca-bundle.pem {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.969148 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cinder.catalog_info = volumev3::publicURL {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.969498 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cinder.certfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.969890 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cinder.collect_timing = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.970259 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cinder.cross_az_attach = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.970641 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cinder.debug = False 
{{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.971032 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cinder.endpoint_template = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.971386 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cinder.http_retries = 3 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.971756 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cinder.insecure = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.972531 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cinder.keyfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.972904 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cinder.os_region_name = RegionOne {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.973272 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cinder.split_loggers = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.973645 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cinder.timeout = None {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.974051 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.974467 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] compute.cpu_dedicated_set = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.975473 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] compute.cpu_shared_set = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.975473 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] compute.image_type_exclude_list = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.975473 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.976185 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] compute.max_concurrent_disk_ops = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.976185 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] compute.max_disk_devices_to_attach = -1 
{{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.976185 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.976401 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.976625 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] compute.resource_provider_association_refresh = 300 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.976877 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] compute.shutdown_retry_interval = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.977268 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.977408 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] conductor.workers = 2 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.977682 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None 
req-57093944-03fa-4a5f-826a-6048b3559346 None None] console.allowed_origins = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.977989 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] console.ssl_ciphers = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.978243 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] console.ssl_minimum_version = default {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.978568 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] consoleauth.token_ttl = 600 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.978938 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.cafile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.979387 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.certfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.979974 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.collect_timing = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.980149 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] 
cyborg.connect_retries = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.980565 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.connect_retry_delay = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.980962 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.endpoint_override = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.981372 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.insecure = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.981783 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.keyfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.982190 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.max_version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.982569 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.min_version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.982964 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.region_name = None {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.983358 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.service_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.983653 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.service_type = accelerator {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.983941 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.split_loggers = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.984211 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.status_code_retries = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.984537 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.status_code_retry_delay = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.984832 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.timeout = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.985134 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.985417 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] cyborg.version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.986350 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.backend = sqlalchemy {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.986350 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.connection = **** {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.986350 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.connection_debug = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.986625 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.connection_parameters = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.986850 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.connection_recycle_time = 3600 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.987120 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.connection_trace = False {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.987403 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.db_inc_retry_interval = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.987765 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.db_max_retries = 20 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.988147 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.db_max_retry_interval = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.988548 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.db_retry_interval = 1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.989152 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.max_overflow = 50 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.989348 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.max_pool_size = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.989812 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.max_retries = 10 {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.990173 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.mysql_enable_ndb = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.990599 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.990993 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.mysql_wsrep_sync_wait = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.991280 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.pool_timeout = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.991592 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.retry_interval = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.991887 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.slave_connection = **** {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.992196 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] database.sqlite_synchronous = True {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.992583 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.backend = sqlalchemy {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.992885 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.connection = **** {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.993148 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.connection_debug = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.993410 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.connection_parameters = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.993680 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.connection_recycle_time = 3600 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.993992 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.connection_trace = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.994235 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.db_inc_retry_interval = True {{(pid=107505) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.994481 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.db_max_retries = 20 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.994727 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.db_max_retry_interval = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.995344 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.db_retry_interval = 1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.995344 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.max_overflow = 50 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.995529 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.max_pool_size = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.996869 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.max_retries = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.996869 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.mysql_enable_ndb = False {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.996869 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.996869 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.997269 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.pool_timeout = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.997365 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.retry_interval = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.997693 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.slave_connection = **** {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.998154 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] api_database.sqlite_synchronous = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.998578 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] devices.enabled_mdev_types = [] {{(pid=107505) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.999019 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.999415 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ephemeral_storage_encryption.enabled = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:05.999837 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.000243 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.api_servers = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.000710 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.cafile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.001111 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.certfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.001583 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.collect_timing = False {{(pid=107505) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.002031 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.connect_retries = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.002390 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.connect_retry_delay = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.002816 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.debug = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.003081 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.default_trusted_certificate_ids = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.003343 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.enable_certificate_validation = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.003654 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.enable_rbd_download = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.003961 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.endpoint_override = None {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.004294 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.insecure = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.004617 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.keyfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.004967 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.max_version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.005257 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.min_version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.005560 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.num_retries = 3 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.005983 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.rbd_ceph_conf = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.006318 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.rbd_connect_timeout = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 
30 14:01:06.006840 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.rbd_pool = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.006840 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.rbd_user = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.007056 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.region_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.007405 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.service_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.007791 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.service_type = image {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.008448 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.split_loggers = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.008448 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.status_code_retries = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.012494 np0035104604 nova-compute[107505]: DEBUG 
oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.status_code_retry_delay = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.012494 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.timeout = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.012494 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.012494 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.verify_glance_signatures = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.012494 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] glance.version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.012494 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] guestfs.debug = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.013149 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.config_drive_cdrom = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.013149 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None 
req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.config_drive_inject_password = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.013149 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.013149 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.enable_instance_metrics_collection = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.013149 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.enable_remotefx = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.013149 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.instances_path_share = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.013773 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.iscsi_initiator_list = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.013773 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.limit_cpu_features = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.013773 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None 
req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.013773 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.014053 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.power_state_check_timeframe = 60 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.014223 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.014526 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.014947 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.use_multipath_io = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.015295 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.volume_attach_retry_count = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.015559 np0035104604 nova-compute[107505]: DEBUG 
oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.015819 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.vswitch_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.016129 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.016458 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] mks.enabled = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.017031 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.017358 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] image_cache.manager_interval = 2400 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.017478 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] image_cache.precache_concurrency = 1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.017853 np0035104604 nova-compute[107505]: DEBUG 
oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] image_cache.remove_unused_base_images = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.018210 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.018604 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.018979 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] image_cache.subdirectory_name = _base {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.019456 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.api_max_retries = 60 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.019850 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.api_retry_interval = 2 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.020179 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.auth_section = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.020577 
np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.auth_type = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.021620 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.cafile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.021620 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.certfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.021620 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.collect_timing = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.022149 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.connect_retries = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.022403 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.connect_retry_delay = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.022675 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.endpoint_override = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.022933 np0035104604 nova-compute[107505]: DEBUG oslo_service.service 
[None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.insecure = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.023165 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.keyfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.023438 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.max_version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.023699 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.min_version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.024026 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.partition_key = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.024324 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.peer_list = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.024706 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.region_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.024993 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] 
ironic.serial_console_state_timeout = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.025229 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.service_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.025497 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.service_type = baremetal {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.025764 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.split_loggers = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.026036 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.status_code_retries = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.026273 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.status_code_retry_delay = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.026625 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.timeout = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.027168 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.valid_interfaces = 
['internal', 'public'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.027168 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] ironic.version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.027422 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.027759 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] key_manager.fixed_key = **** {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.028047 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.028315 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.barbican_api_version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.028728 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.barbican_endpoint = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.029107 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None 
None] barbican.barbican_endpoint_type = public {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.032751 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.barbican_region_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.032751 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.cafile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.032751 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.certfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.032751 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.collect_timing = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.032751 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.insecure = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.032751 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.keyfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.033260 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.number_of_retries = 60 
{{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.033260 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.retry_delay = 1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.033260 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.send_service_user_token = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.033260 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.split_loggers = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.033260 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.timeout = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.033260 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.verify_ssl = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.033787 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican.verify_ssl_path = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.033787 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican_service_user.auth_section = None {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.033787 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican_service_user.auth_type = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.034177 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican_service_user.cafile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.034438 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican_service_user.certfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.034781 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican_service_user.collect_timing = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.035021 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican_service_user.insecure = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.035373 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican_service_user.keyfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.035787 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican_service_user.split_loggers = False 
{{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.036119 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] barbican_service_user.timeout = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.036390 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.approle_role_id = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.036812 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.approle_secret_id = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.037399 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.cafile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.037399 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.certfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.037715 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.collect_timing = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.038051 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.insecure = False {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.038296 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.keyfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.038688 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.kv_mountpoint = secret {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.039063 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.kv_version = 2 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.039448 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.namespace = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.039844 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.root_token_id = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.040257 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.split_loggers = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.040715 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.ssl_ca_crt_file = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} 
Aug 30 14:01:06.041114 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.timeout = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.041487 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.use_ssl = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.041815 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.042214 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.cafile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.042602 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.certfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.042893 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.collect_timing = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.043143 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.connect_retries = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.043446 np0035104604 nova-compute[107505]: 
DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.connect_retry_delay = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.043825 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.endpoint_override = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.044106 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.insecure = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.045222 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.keyfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.045222 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.max_version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.045682 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.min_version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.045915 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.region_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.046183 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None 
req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.service_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.046545 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.service_type = identity {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.047488 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.split_loggers = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.047488 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.status_code_retries = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.047794 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.status_code_retry_delay = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.048131 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.timeout = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.048527 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.048855 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None 
req-57093944-03fa-4a5f-826a-6048b3559346 None None] keystone.version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.049215 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.connection_uri = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.049550 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.cpu_mode = custom {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.049916 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.cpu_model_extra_flags = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.050483 np0035104604 nova-compute[107505]: WARNING oslo_config.cfg [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] Deprecated: Option "cpu_model" from group "libvirt" is deprecated. Use option "cpu_models" from group "libvirt". 
Aug 30 14:01:06.050926 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.cpu_models = ['Nehalem'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.051311 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.cpu_power_governor_high = performance {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.051644 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.cpu_power_governor_low = powersave {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.052062 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.cpu_power_management = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.052445 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.052771 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.device_detach_attempts = 8 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.053158 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.device_detach_timeout = 20 {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.053574 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.disk_cachemodes = ['network=writeback'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.053865 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.disk_prefix = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.054126 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.enabled_perf_events = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.054506 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.file_backed_memory = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.054912 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.gid_maps = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.055220 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.hw_disk_discard = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.055502 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.hw_machine_type = None {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.055800 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.056124 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.056450 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.056756 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.images_rbd_glance_store_name = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.057127 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.images_rbd_pool = vms {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.057583 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.images_type = rbd {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.057668 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.images_volume_group = None 
{{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.058021 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.inject_key = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.058382 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.inject_partition = -2 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.058707 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.inject_password = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.059003 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.iscsi_iface = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.059314 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.iser_use_multipath = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.059590 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.live_migration_bandwidth = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.059875 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=107505) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.060161 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.live_migration_downtime = 500 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.060480 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.060908 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.061230 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.live_migration_inbound_addr = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.061565 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.061936 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.live_migration_permit_post_copy = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.062654 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] 
libvirt.live_migration_scheme = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.062765 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.live_migration_timeout_action = abort {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.063236 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.live_migration_tunnelled = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.063734 np0035104604 nova-compute[107505]: WARNING oslo_config.cfg [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Aug 30 14:01:06.063734 np0035104604 nova-compute[107505]: live_migration_uri is deprecated for removal in favor of two other options that Aug 30 14:01:06.063734 np0035104604 nova-compute[107505]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Aug 30 14:01:06.063734 np0035104604 nova-compute[107505]: and ``live_migration_inbound_addr`` respectively. Aug 30 14:01:06.063734 np0035104604 nova-compute[107505]: ). Its value may be silently ignored in the future. 
Aug 30 14:01:06.064234 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.live_migration_uri = qemu+ssh://stack@%s/system {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.064549 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.live_migration_with_native_tls = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.064884 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.max_queues = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.065230 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.065566 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.nfs_mount_options = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.066112 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.066504 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.066835 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.num_iser_scan_tries = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.067294 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.num_memory_encrypted_guests = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.067848 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.068155 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.num_pcie_ports = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.068600 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.num_volume_scan_tries = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.069024 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.pmem_namespaces = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.069382 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.quobyte_client_cfg = None {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.069855 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.070078 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.rbd_connect_timeout = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.070369 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.070687 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.070961 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.rbd_secret_uuid = 2b09afd7-f4eb-416e-8f88-5f78f2b270f5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.071219 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.rbd_user = cinder {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.071464 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] 
libvirt.realtime_scheduler_priority = 1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.071746 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.remote_filesystem_transport = ssh {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.071995 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.rescue_image_id = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.072236 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.rescue_kernel_id = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.072516 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.rescue_ramdisk_id = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.072806 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.073048 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.rx_queue_size = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.073304 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] 
libvirt.smbfs_mount_options = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.073621 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.073944 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.snapshot_compression = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.074189 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.snapshot_image_format = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.074717 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.074782 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.sparse_logical_volumes = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.075117 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.swtpm_enabled = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.075419 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None 
req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.swtpm_group = tss {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.075789 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.swtpm_user = tss {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.076225 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.sysinfo_serial = unique {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.076665 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.tb_cache_size = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.077065 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.tx_queue_size = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.077467 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.uid_maps = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.078026 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.use_virtio_for_bridges = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.078195 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] 
libvirt.virt_type = qemu {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.078501 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.volume_clear = zero {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.078759 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.volume_clear_size = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.079014 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.volume_use_multipath = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.079344 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.vzstorage_cache_path = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.079795 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.080177 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.vzstorage_mount_group = qemu {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.080480 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None 
req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.vzstorage_mount_opts = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.080812 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.081307 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.081720 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.vzstorage_mount_user = stack {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.082069 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.082355 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.auth_section = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.082648 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.auth_type = password {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.082932 np0035104604 nova-compute[107505]: DEBUG 
oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.cafile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.083204 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.certfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.083580 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.collect_timing = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.083913 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.connect_retries = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.084298 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.connect_retry_delay = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.084642 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.default_floating_pool = public {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.084948 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.endpoint_override = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.085246 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None 
req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.extension_sync_interval = 600 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.085547 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.http_retries = 3 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.085933 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.insecure = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.086273 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.keyfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.086686 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.max_version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.088066 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.088066 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.min_version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.088066 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None 
None] neutron.ovs_bridge = br-int {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.088066 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.physnets = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.088437 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.region_name = RegionOne {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.088826 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.service_metadata_proxy = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.089122 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.service_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.089417 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.service_type = network {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.089702 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.split_loggers = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.090001 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.status_code_retries = None 
{{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.090282 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.status_code_retry_delay = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.090561 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.timeout = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.090873 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.091150 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] neutron.version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.091575 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] notifications.bdms_in_notifications = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.091986 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] notifications.default_level = INFO {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.092398 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] notifications.notification_format = 
unversioned {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.092701 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] notifications.notify_on_state_change = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.093019 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.093325 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] pci.alias = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.093584 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] pci.device_spec = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.093855 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] pci.report_in_placement = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.094143 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.auth_section = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.094397 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.auth_type = 
password {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.094741 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.auth_url = https://149.202.177.86/identity {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.095122 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.cafile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.095475 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.certfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.095805 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.collect_timing = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.096136 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.connect_retries = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.108620 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.connect_retry_delay = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.109717 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.default_domain_id = None 
{{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.109717 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.default_domain_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.110109 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.domain_id = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.110444 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.domain_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.110781 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.endpoint_override = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.111181 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.insecure = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.111542 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.keyfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.111874 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.max_version = None {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.112207 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.min_version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.112578 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.password = **** {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.112903 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.project_domain_id = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.113240 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.project_domain_name = Default {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.113578 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.project_id = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.113967 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.project_name = service {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.114329 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.region_name = RegionOne {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.114654 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.service_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.114992 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.service_type = placement {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.115331 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.split_loggers = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.115666 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.status_code_retries = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.116211 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.status_code_retry_delay = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.116450 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.system_scope = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.116827 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.timeout = None {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.117150 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.trust_id = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.117476 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.user_domain_id = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.117852 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.user_domain_name = Default {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.118198 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.user_id = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.118536 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.username = placement {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.118925 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.119250 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] placement.version = None {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.119599 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] quota.cores = 20 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.119963 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] quota.count_usage_from_placement = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.120305 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.120663 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] quota.injected_file_content_bytes = 10240 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.121000 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] quota.injected_file_path_length = 255 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.121333 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] quota.injected_files = 5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.121669 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] quota.instances = 10 {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.122047 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] quota.key_pairs = 100 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.122374 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] quota.metadata_items = 128 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.122710 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] quota.ram = 51200 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.123041 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] quota.recheck_quota = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.123386 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] quota.server_group_members = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.123715 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] quota.server_groups = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.124050 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] rdp.enabled = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 
14:01:06.124803 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.125409 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.125515 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.125953 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] scheduler.image_metadata_prefilter = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.126397 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.126825 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] scheduler.max_attempts = 3 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.127249 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] scheduler.max_placement_results = 1000 {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.127611 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.127850 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] scheduler.query_placement_for_availability_zone = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.128069 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] scheduler.query_placement_for_image_type_support = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.128327 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.128639 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] scheduler.workers = 2 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.128997 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.129315 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None 
req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.129627 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.130022 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.130252 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.130517 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.130795 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.131134 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 
'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.131421 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.host_subset_size = 1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.131702 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.132058 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.132292 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.132578 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.isolated_hosts = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.132878 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.isolated_images = [] {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.133154 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.133399 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.133654 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.pci_in_placement = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.133950 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.134213 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.134469 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.134715 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] 
filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.134959 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.135207 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.135788 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.track_instance_changes = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.135788 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.136252 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] metrics.required = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.136324 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] metrics.weight_multiplier = 1.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.136750 np0035104604 
nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.137185 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] metrics.weight_setting = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.137867 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.138282 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] serial_console.enabled = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.138746 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] serial_console.port_range = 10000:20000 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.138895 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.139203 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 
14:01:06.139418 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] serial_console.serialproxy_port = 6083 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.139679 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] service_user.auth_section = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.139932 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] service_user.auth_type = password {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.140173 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] service_user.cafile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.140552 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] service_user.certfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.141011 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] service_user.collect_timing = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.141407 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] service_user.insecure = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.141674 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] service_user.keyfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.142061 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] service_user.send_service_user_token = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.142499 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] service_user.split_loggers = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.142877 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] service_user.timeout = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.143207 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] spice.agent_enabled = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.143571 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] spice.enabled = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.144092 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] spice.html5proxy_base_url = http://149.202.177.86:6081/spice_auto.html {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.144449 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.144820 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] spice.html5proxy_port = 6082 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.145151 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] spice.image_compression = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.145474 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] spice.jpeg_compression = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.145905 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] spice.playback_compression = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.146233 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] spice.server_listen = 127.0.0.1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.146589 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.147111 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] spice.streaming_mode = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.147577 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] spice.zlib_compression = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.147911 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] upgrade_levels.baseapi = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.148184 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] upgrade_levels.cert = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.148548 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] upgrade_levels.compute = auto {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.148919 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] upgrade_levels.conductor = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.149178 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] upgrade_levels.scheduler = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.149435 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vendordata_dynamic_auth.auth_section = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.149856 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vendordata_dynamic_auth.auth_type = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.150281 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vendordata_dynamic_auth.cafile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.150626 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vendordata_dynamic_auth.certfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.150960 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.151348 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vendordata_dynamic_auth.insecure = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.151638 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vendordata_dynamic_auth.keyfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.151941 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.152353 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vendordata_dynamic_auth.timeout = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.152812 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.api_retry_count = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.153156 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.ca_file = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.153453 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.cache_prefix = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.153749 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.cluster_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.154065 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.connection_pool_size = 10 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.154354 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.console_delay_seconds = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.154639 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.datastore_regex = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.154916 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.host_ip = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.155253 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.host_password = **** {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.155579 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.host_port = 443 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.155934 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.host_username = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.156278 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.insecure = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.156732 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.integration_bridge = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.157111 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.maximum_objects = 100 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.157496 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.pbm_default_policy = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.157849 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.pbm_enabled = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.158167 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.pbm_wsdl_location = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.158488 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.158741 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.serial_port_proxy_uri = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.158978 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.serial_port_service_uri = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.159242 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.task_poll_interval = 0.5 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.159700 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.use_linked_clone = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.160084 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.vnc_keymap = en-us {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.160382 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.vnc_port = 5900 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.160675 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vmware.vnc_port_total = 10000 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.161044 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vnc.auth_schemes = ['none'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.161426 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vnc.enabled = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.162131 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vnc.novncproxy_base_url = http://149.202.177.86:6080/vnc_auto.html {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.162515 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.162799 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vnc.novncproxy_port = 6080 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.163078 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vnc.server_listen = 0.0.0.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.163389 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vnc.server_proxyclient_address = 149.202.177.86 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.163673 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vnc.vencrypt_ca_certs = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.163957 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vnc.vencrypt_client_cert = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.164229 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vnc.vencrypt_client_key = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.164567 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.164873 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.165179 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.disable_group_policy_check_upcall = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.165571 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.165913 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.disable_rootwrap = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.166278 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.enable_numa_live_migration = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.166697 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.167108 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.167532 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.168085 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.libvirt_disable_apic = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.168508 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.168946 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.169355 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.169805 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.170242 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.170673 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.171095 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.171514 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.171959 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.172394 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.172902 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.173307 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] wsgi.client_socket_timeout = 900 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.174552 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] wsgi.default_pool_size = 1000 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.175794 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] wsgi.keep_alive = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.175928 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] wsgi.max_header_line = 16384 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.176251 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] wsgi.secure_proxy_ssl_header = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.176555 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] wsgi.ssl_ca_file = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.176871 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] wsgi.ssl_cert_file = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.177214 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] wsgi.ssl_key_file = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.177638 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] wsgi.tcp_keepidle = 600 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.178647 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.178647 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] zvm.ca_file = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.178958 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] zvm.cloud_connector_url = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.179351 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.179990 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] zvm.reachable_timeout = 300 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.181230 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_policy.enforce_new_defaults = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.181230 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_policy.enforce_scope = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.181230 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_policy.policy_default_rule = default {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.181444 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.181864 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_policy.policy_file = policy.yaml {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.182284 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.182627 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.183035 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.183445 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.183845 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.184263 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.184598 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.184981 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] profiler.connection_string = messaging:// {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.185317 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] profiler.enabled = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.185767 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] profiler.es_doc_type = notification {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.186191 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] profiler.es_scroll_size = 10000 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.186623 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] profiler.es_scroll_time = 2m {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.187039 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] profiler.filter_error_trace = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.187441 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] profiler.hmac_keys = **** {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.187866 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] profiler.sentinel_service_name = mymaster {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.188289 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] profiler.socket_timeout = 0.1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.188728 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] profiler.trace_requests = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.189145 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] profiler.trace_sqlalchemy = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.189579 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] profiler_jaeger.process_tags = {} {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.190011 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] profiler_jaeger.service_name_prefix = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.190433 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] profiler_otlp.service_name_prefix = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.190833 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] remote_debug.host = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.191238 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] remote_debug.port = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.191685 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.192085 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.192525 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.192931 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.193341 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.193785 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.194166 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.194554 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.194972 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.195384 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
Aug 30 14:01:06.195817 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin
{{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.196234 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.196708 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.197102 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.197521 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.198061 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.198387 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.198791 np0035104604 nova-compute[107505]: DEBUG oslo_service.service 
[None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.199238 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.199556 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.199900 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.200287 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.200726 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.201141 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.201565 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.201991 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.ssl = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.202432 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.202850 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.203259 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.203684 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.204109 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_rabbit.ssl_version = 
{{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.204598 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.205009 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_notifications.retry = -1 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.205454 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.205900 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_messaging_notifications.transport_url = **** {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.206344 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.auth_section = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.206763 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.auth_type = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.207144 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None 
None] oslo_limit.cafile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.207555 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.certfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.207993 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.collect_timing = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.209182 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.connect_retries = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.209182 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.connect_retry_delay = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.209330 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.endpoint_id = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.209743 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.endpoint_override = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.210132 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.insecure = 
False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.210535 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.keyfile = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.210948 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.max_version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.211335 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.min_version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.211743 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.region_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.212152 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.service_name = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.212581 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.service_type = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.212998 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.split_loggers = False {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.213399 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.status_code_retries = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.213828 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.status_code_retry_delay = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.214263 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.timeout = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.214666 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.valid_interfaces = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.215067 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_limit.version = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.215497 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_reports.file_event_handler = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.215886 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.216306 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] oslo_reports.log_dir = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.216719 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.217104 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.217529 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.217962 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.218374 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.218823 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None 
req-57093944-03fa-4a5f-826a-6048b3559346 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.219271 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.219677 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vif_plug_ovs_privileged.group = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.220148 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.220513 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.221131 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.221320 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] vif_plug_ovs_privileged.user = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.221741 
np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] os_vif_linux_bridge.flat_interface = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.222171 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.222600 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.223024 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.223468 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.223810 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.224083 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=107505) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.224336 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.224682 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] os_vif_ovs.isolate_vif = False {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.224889 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.225157 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.225413 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.225674 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] os_vif_ovs.ovsdb_interface = native {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.225935 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] os_vif_ovs.per_port_bridge = False {{(pid=107505) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.226182 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] os_brick.lock_path = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.226436 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] privsep_osbrick.capabilities = [21] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.226674 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] privsep_osbrick.group = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.226913 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] privsep_osbrick.helper_command = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.227160 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.227403 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.227642 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] privsep_osbrick.user = None {{(pid=107505) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.227895 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.228138 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] nova_sys_admin.group = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.228373 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] nova_sys_admin.helper_command = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.228648 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.228895 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.229127 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] nova_sys_admin.user = None {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} Aug 30 14:01:06.229325 np0035104604 nova-compute[107505]: DEBUG oslo_service.service [None req-57093944-03fa-4a5f-826a-6048b3559346 None None] 
******************************************************************************** {{(pid=107505) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} Aug 30 14:01:06.230457 np0035104604 nova-compute[107505]: INFO nova.service [-] Starting compute node (version 27.1.0) Aug 30 14:01:06.247288 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Starting native event thread {{(pid=107505) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:492}} Aug 30 14:01:06.247942 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Starting green dispatch thread {{(pid=107505) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:498}} Aug 30 14:01:06.248554 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Starting connection event dispatch thread {{(pid=107505) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:501}} Aug 30 14:01:06.249004 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Connecting to libvirt: qemu:///system {{(pid=107505) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:506}} Aug 30 14:01:06.260130 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Registering for lifecycle events {{(pid=107505) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:512}} Aug 30 14:01:06.264215 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Registering for connection events: {{(pid=107505) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:533}} Aug 30 14:01:06.265382 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 
None None] Connection event '1' reason 'None' Aug 30 14:01:06.294696 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Cannot update service status on host "np0035104604" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host np0035104604 could not be found. Aug 30 14:01:06.295027 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.volume.mount [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Initialising _HostMountState generation 0 {{(pid=107505) host_up /opt/stack/nova/nova/virt/libvirt/volume/mount.py:130}} Aug 30 14:01:13.828765 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host capabilities
[libvirt host capabilities XML elided: the multi-line dump lost its angle-bracket tags during extraction, leaving only bare values spread across empty continuation lines. Recoverable fragments: host UUID ed1e0f24-f0a1-46a6-80a0-067b49fc4977; CPU arch x86_64, model Haswell-noTSX, vendor Intel; migration transports tcp and rdma; memory/page counts 7932464 and 1983116; security model dac with baselabel +64055:+108; guest os_type hvm, wordsize 64, emulator /usr/bin/qemu-system-alpha, machine clipper; dump truncated mid-stream]
14:01:13.833312 np0035104604 nova-compute[107505]: Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: hvm Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: 32 Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-arm Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: integratorcp Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: ast2600-evb Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: borzoi Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: spitz Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: virt-2.7 Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: nuri Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: mcimx7d-sabre Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: romulus-bmc Aug 30 14:01:13.833312 np0035104604 nova-compute[107505]: virt-3.0 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: virt-5.0 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: npcm750-evb Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: virt-2.10 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: rainier-bmc Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: mps3-an547 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: musca-b1 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: realview-pbx-a9 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: versatileab Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: kzm Aug 30 14:01:13.835839 np0035104604 
nova-compute[107505]: virt-2.8 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: musca-a Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: virt-3.1 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: mcimx6ul-evk Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: virt-5.1 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: smdkc210 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: sx1 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: virt-2.11 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: imx25-pdk Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: stm32vldiscovery Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: virt-2.9 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: orangepi-pc Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: quanta-q71l-bmc Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: z2 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: virt-5.2 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: xilinx-zynq-a9 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: tosa Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: mps2-an500 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: virt-2.12 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: mps2-an521 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: sabrelite Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: mps2-an511 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: canon-a1100 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: realview-eb Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: quanta-gbs-bmc Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: emcraft-sf2 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: realview-pb-a8 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: virt-4.0 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: raspi1ap Aug 30 14:01:13.835839 np0035104604 
nova-compute[107505]: palmetto-bmc Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: sx1-v1 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: n810 Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: g220a-bmc Aug 30 14:01:13.835839 np0035104604 nova-compute[107505]: n800 Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: tacoma-bmc Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: virt-4.1 Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: quanta-gsj Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: versatilepb Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: terrier Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: mainstone Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: realview-eb-mpcore Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: supermicrox11-bmc Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: virt-4.2 Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: witherspoon-bmc Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: mps3-an524 Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: swift-bmc Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: kudo-bmc Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: vexpress-a9 Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: midway Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: musicpal Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: lm3s811evb Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: lm3s6965evb Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: microbit Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: mps2-an505 Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: mps2-an385 Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: virt-6.0 Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: cubieboard Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: verdex Aug 30 14:01:13.837690 np0035104604 
nova-compute[107505]: netduino2 Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: mps2-an386 Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: virt-6.1 Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: raspi2b Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: vexpress-a15 Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: fuji-bmc Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: virt-6.2 Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: virt Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: sonorapass-bmc Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: cheetah Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: virt-2.6 Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: ast2500-evb Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: highbank Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: akita Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: connex Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: netduinoplus2 Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: collie Aug 30 14:01:13.837690 np0035104604 nova-compute[107505]: raspi0 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: fp5280g2-bmc Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: hvm Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: 32 Aug 30 14:01:13.839242 np0035104604 
nova-compute[107505]: /usr/bin/qemu-system-arm Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: integratorcp Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: ast2600-evb Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: borzoi Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: spitz Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: virt-2.7 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: nuri Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: mcimx7d-sabre Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: romulus-bmc Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: virt-3.0 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: virt-5.0 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: npcm750-evb Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: virt-2.10 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: rainier-bmc Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: mps3-an547 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: musca-b1 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: realview-pbx-a9 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: versatileab Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: kzm Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: virt-2.8 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: musca-a Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: virt-3.1 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: mcimx6ul-evk Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: virt-5.1 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: smdkc210 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: sx1 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: virt-2.11 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: imx25-pdk Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: stm32vldiscovery Aug 30 14:01:13.839242 np0035104604 
nova-compute[107505]: virt-2.9 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: orangepi-pc Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: quanta-q71l-bmc Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: z2 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: virt-5.2 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: xilinx-zynq-a9 Aug 30 14:01:13.839242 np0035104604 nova-compute[107505]: tosa Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: mps2-an500 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: virt-2.12 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: mps2-an521 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: sabrelite Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: mps2-an511 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: canon-a1100 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: realview-eb Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: quanta-gbs-bmc Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: emcraft-sf2 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: realview-pb-a8 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: virt-4.0 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: raspi1ap Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: palmetto-bmc Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: sx1-v1 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: n810 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: g220a-bmc Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: n800 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: tacoma-bmc Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: virt-4.1 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: quanta-gsj Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: versatilepb Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: terrier Aug 30 14:01:13.841319 np0035104604 
nova-compute[107505]: mainstone Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: realview-eb-mpcore Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: supermicrox11-bmc Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: virt-4.2 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: witherspoon-bmc Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: mps3-an524 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: swift-bmc Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: kudo-bmc Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: vexpress-a9 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: midway Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: musicpal Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: lm3s811evb Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: lm3s6965evb Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: microbit Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: mps2-an505 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: mps2-an385 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: virt-6.0 Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: cubieboard Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: verdex Aug 30 14:01:13.841319 np0035104604 nova-compute[107505]: netduino2 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: mps2-an386 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: virt-6.1 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: raspi2b Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: vexpress-a15 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: fuji-bmc Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: virt-6.2 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: virt Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: sonorapass-bmc Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: cheetah Aug 30 14:01:13.843088 np0035104604 
nova-compute[107505]: virt-2.6 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: ast2500-evb Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: highbank Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: akita Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: connex Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: netduinoplus2 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: collie Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: raspi0 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: fp5280g2-bmc Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: hvm Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: 64 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-aarch64 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: integratorcp Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: ast2600-evb Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: borzoi Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: spitz Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: virt-2.7 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: nuri Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: mcimx7d-sabre Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: romulus-bmc Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: virt-3.0 Aug 30 14:01:13.843088 
np0035104604 nova-compute[107505]: virt-5.0 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: npcm750-evb Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: virt-2.10 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: rainier-bmc Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: mps3-an547 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: virt-2.8 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: musca-b1 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: realview-pbx-a9 Aug 30 14:01:13.843088 np0035104604 nova-compute[107505]: versatileab Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: kzm Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: musca-a Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: virt-3.1 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: mcimx6ul-evk Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: virt-5.1 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: smdkc210 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: sx1 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: virt-2.11 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: imx25-pdk Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: stm32vldiscovery Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: virt-2.9 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: orangepi-pc Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: quanta-q71l-bmc Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: z2 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: virt-5.2 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: xilinx-zynq-a9 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: xlnx-zcu102 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: tosa Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: mps2-an500 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: virt-2.12 Aug 30 14:01:13.845700 np0035104604 
nova-compute[107505]: mps2-an521 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: sabrelite Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: mps2-an511 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: canon-a1100 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: realview-eb Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: quanta-gbs-bmc Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: emcraft-sf2 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: realview-pb-a8 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: sbsa-ref Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: virt-4.0 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: raspi1ap Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: palmetto-bmc Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: sx1-v1 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: n810 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: g220a-bmc Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: n800 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: tacoma-bmc Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: virt-4.1 Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: quanta-gsj Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: versatilepb Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: terrier Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: mainstone Aug 30 14:01:13.845700 np0035104604 nova-compute[107505]: realview-eb-mpcore Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: supermicrox11-bmc Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: virt-4.2 Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: witherspoon-bmc Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: mps3-an524 Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: swift-bmc Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: kudo-bmc Aug 30 14:01:13.848634 np0035104604 
nova-compute[107505]: vexpress-a9 Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: midway Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: musicpal Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: lm3s811evb Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: lm3s6965evb Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: microbit Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: mps2-an505 Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: mps2-an385 Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: virt-6.0 Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: raspi3ap Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: cubieboard Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: verdex Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: netduino2 Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: xlnx-versal-virt Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: mps2-an386 Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: virt-6.1 Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: raspi3b Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: raspi2b Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: vexpress-a15 Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: fuji-bmc Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: virt-6.2 Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: virt Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: sonorapass-bmc Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: cheetah Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: virt-2.6 Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: ast2500-evb Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: highbank Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: akita Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: connex Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: 
netduinoplus2 Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: collie Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: raspi0 Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: fp5280g2-bmc Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: Aug 30 14:01:13.848634 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: hvm Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: 32 Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-cris Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: axis-dev88 Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: hvm Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: 32 Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-i386 Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-jammy Aug 30 
14:01:13.852398 np0035104604 nova-compute[107505]: ubuntu Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-impish-hpb Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-q35-5.2 Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-2.12 Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-2.0 Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-xenial Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-6.2 Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-q35-4.2 Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-2.5 Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-4.2 Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-focal Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-hirsute Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-q35-xenial Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-jammy-hpb Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-5.2 Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-1.5 Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-q35-2.7 Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-q35-eoan-hpb Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-zesty Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-disco-hpb Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-q35-groovy Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-groovy Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-q35-artful Aug 30 14:01:13.852398 np0035104604 nova-compute[107505]: pc-i440fx-2.2 Aug 30 14:01:13.856343 np0035104604 nova-compute[107505]: pc-i440fx-trusty Aug 30 14:01:13.856343 np0035104604 nova-compute[107505]: pc-i440fx-eoan-hpb Aug 30 14:01:13.856343 np0035104604 
Aug 30 14:01:13.856343 np0035104604 nova-compute[107505]: [libvirt guest capabilities XML logged with tags stripped; condensed below to one line per guest: virt type, word size, emulator binary, supported machine types]
Aug 30 14:01:13.856343 np0035104604 nova-compute[107505]: (machine-type list continued from the preceding guest entry) pc-q35-focal-hpb pc-q35-bionic-hpb pc-i440fx-artful pc-i440fx-2.7 pc-q35-6.1 pc-i440fx-yakkety pc-q35-2.4 pc-q35-cosmic-hpb pc-q35-2.10 x-remote pc-q35-5.1 pc-i440fx-1.7 pc-q35-2.9 pc-i440fx-2.11 pc-q35-3.1 pc-i440fx-6.1 pc-q35-4.1 pc-q35-jammy ubuntu-q35 pc-i440fx-2.4 pc-i440fx-4.1 pc-q35-eoan pc-q35-jammy-hpb pc-i440fx-5.1 pc-i440fx-2.9 pc-i440fx-bionic-hpb isapc pc-i440fx-1.4 pc-q35-cosmic pc-q35-2.6 pc-i440fx-3.1 pc-q35-bionic pc-q35-disco-hpb pc-i440fx-cosmic pc-q35-2.12 pc-i440fx-bionic pc-q35-groovy-hpb pc-q35-disco pc-i440fx-cosmic-hpb pc-i440fx-2.1 pc-i440fx-wily pc-q35-impish pc-q35-6.0 pc-i440fx-impish pc-i440fx-2.6 pc-q35-impish-hpb pc-q35-hirsute pc-q35-4.0.1 pc-q35-hirsute-hpb pc-i440fx-1.6 pc-q35-5.0 pc-q35-2.8 pc-i440fx-2.10 pc-q35-3.0 pc-i440fx-6.0 pc-q35-zesty pc-q35-4.0 pc-q35-focal microvm pc-i440fx-2.3 pc-i440fx-focal-hpb pc-i440fx-disco pc-i440fx-4.0 pc-i440fx-groovy-hpb pc-i440fx-hirsute-hpb pc-i440fx-5.0 pc-q35-6.2 q35 pc-i440fx-2.8 pc-i440fx-eoan pc-q35-2.5 pc-i440fx-3.0 pc-q35-yakkety pc-q35-2.11
Aug 30 14:01:13.862049 np0035104604 nova-compute[107505]: hvm 32 /usr/bin/qemu-system-m68k: mcf5208evb an5206 virt-6.0 q800 virt-6.2 virt next-cube virt-6.1
Aug 30 14:01:13.862049 np0035104604 nova-compute[107505]: hvm 32 /usr/bin/qemu-system-microblaze: petalogix-s3adsp1800 petalogix-ml605 xlnx-zynqmp-pmu
Aug 30 14:01:13.862049 np0035104604 nova-compute[107505]: hvm 32 /usr/bin/qemu-system-microblazeel: petalogix-s3adsp1800 petalogix-ml605 xlnx-zynqmp-pmu
Aug 30 14:01:13.868005 np0035104604 nova-compute[107505]: hvm 32 /usr/bin/qemu-system-mips: malta mipssim
Aug 30 14:01:13.868005 np0035104604 nova-compute[107505]: hvm 32 /usr/bin/qemu-system-mipsel: malta mipssim
Aug 30 14:01:13.868005 np0035104604 nova-compute[107505]: hvm 64 /usr/bin/qemu-system-mips64: malta mipssim pica61 magnum
Aug 30 14:01:13.868005 np0035104604 nova-compute[107505]: hvm 64 /usr/bin/qemu-system-mips64el: malta loongson3-virt mipssim pica61 magnum boston fuloong2e
Aug 30 14:01:13.868005 np0035104604 nova-compute[107505]: hvm 32 /usr/bin/qemu-system-ppc: g3beige virtex-ml507 mac99 ppce500 pegasos2 sam460ex bamboo 40p ref405ep mpc8544ds taihu
Aug 30 14:01:13.873697 np0035104604 nova-compute[107505]: hvm 64 /usr/bin/qemu-system-ppc64: pseries-jammy pseries powernv9 powernv taihu pseries-4.1 mpc8544ds pseries-6.1 pseries-2.5 powernv10 pseries-xenial pseries-4.2 pseries-6.2 pseries-yakkety pseries-2.6 ppce500 pseries-bionic-sxxm pseries-2.7 pseries-3.0 pseries-5.0 40p pseries-2.8 pegasos2 pseries-hirsute pseries-3.1 pseries-5.1 pseries-eoan pseries-2.9 pseries-zesty bamboo pseries-groovy pseries-focal g3beige pseries-5.2 pseries-disco pseries-2.12-sxxm pseries-2.10 virtex-ml507 pseries-2.11 pseries-2.1 pseries-cosmic pseries-bionic pseries-2.12 pseries-2.2 mac99 pseries-impish pseries-artful sam460ex ref405ep pseries-2.3 powernv8 pseries-4.0 pseries-6.0 pseries-2.4
Aug 30 14:01:13.880775 np0035104604 nova-compute[107505]: hvm 64 /usr/bin/qemu-system-ppc64le: pseries-jammy pseries powernv9 powernv taihu pseries-4.1 mpc8544ds pseries-6.1 pseries-2.5 powernv10 pseries-xenial pseries-4.2 pseries-6.2 pseries-yakkety pseries-2.6 ppce500 pseries-bionic-sxxm pseries-2.7 pseries-3.0 pseries-5.0 40p pseries-2.8 pegasos2 pseries-hirsute pseries-3.1 pseries-5.1 pseries-eoan pseries-2.9 pseries-zesty bamboo pseries-groovy pseries-focal g3beige pseries-5.2 pseries-disco pseries-2.12-sxxm pseries-2.10 virtex-ml507 pseries-2.11 pseries-2.1 pseries-cosmic pseries-bionic pseries-2.12 pseries-2.2 mac99 pseries-impish pseries-artful sam460ex ref405ep pseries-2.3 powernv8 pseries-4.0 pseries-6.0 pseries-2.4
Aug 30 14:01:13.883737 np0035104604 nova-compute[107505]: hvm 32 /usr/bin/qemu-system-riscv32: spike opentitan sifive_u sifive_e virt
Aug 30 14:01:13.887933 np0035104604 nova-compute[107505]: hvm 64 /usr/bin/qemu-system-riscv64: spike microchip-icicle-kit sifive_u shakti_c sifive_e virt
Aug 30 14:01:13.887933 np0035104604 nova-compute[107505]: hvm 64 /usr/bin/qemu-system-s390x: s390-ccw-virtio-jammy s390-ccw-virtio s390-ccw-virtio-4.0 s390-ccw-virtio-5.2 s390-ccw-virtio-artful s390-ccw-virtio-3.1 s390-ccw-virtio-groovy s390-ccw-virtio-hirsute s390-ccw-virtio-disco s390-ccw-virtio-2.12 s390-ccw-virtio-2.6 s390-ccw-virtio-yakkety s390-ccw-virtio-eoan s390-ccw-virtio-2.9 s390-ccw-virtio-6.0 s390-ccw-virtio-5.1 s390-ccw-virtio-3.0 s390-ccw-virtio-4.2 s390-ccw-virtio-2.5 s390-ccw-virtio-2.11 s390-ccw-virtio-xenial s390-ccw-virtio-focal s390-ccw-virtio-2.8 s390-ccw-virtio-impish s390-ccw-virtio-bionic s390-ccw-virtio-5.0 s390-ccw-virtio-6.2 s390-ccw-virtio-zesty s390-ccw-virtio-4.1 s390-ccw-virtio-cosmic s390-ccw-virtio-2.4 s390-ccw-virtio-2.10 s390-ccw-virtio-2.7 s390-ccw-virtio-6.1
Aug 30 14:01:13.891851 np0035104604 nova-compute[107505]: hvm 32 /usr/bin/qemu-system-sh4: shix r2d
Aug 30 14:01:13.891851 np0035104604 nova-compute[107505]: hvm 64 /usr/bin/qemu-system-sh4eb: shix r2d
Aug 30 14:01:13.891851 np0035104604 nova-compute[107505]: hvm 32 /usr/bin/qemu-system-sparc: SS-5 SS-20 LX SPARCClassic leon3_generic SPARCbook SS-4 SS-600MP SS-10 Voyager
Aug 30 14:01:13.896770 np0035104604 nova-compute[107505]: hvm 64 /usr/bin/qemu-system-sparc64: sun4u niagara sun4v
Aug 30 14:01:13.896770 np0035104604 nova-compute[107505]: hvm 64 /usr/bin/qemu-system-x86_64: pc-i440fx-jammy ubuntu pc-i440fx-impish-hpb pc-q35-5.2 pc-i440fx-2.12 pc-i440fx-2.0 pc-i440fx-xenial pc-i440fx-6.2 pc pc-q35-4.2 pc-i440fx-2.5 pc-i440fx-4.2 pc-i440fx-hirsute pc-i440fx-focal pc-q35-xenial pc-i440fx-jammy-hpb pc-i440fx-5.2 pc-i440fx-1.5 pc-q35-2.7 pc-q35-eoan-hpb pc-i440fx-zesty pc-i440fx-disco-hpb pc-q35-groovy pc-i440fx-groovy pc-q35-artful pc-i440fx-trusty pc-i440fx-2.2 pc-q35-focal-hpb pc-i440fx-eoan-hpb pc-q35-bionic-hpb pc-i440fx-artful pc-i440fx-2.7 pc-q35-6.1 pc-i440fx-yakkety pc-q35-2.4 pc-q35-cosmic-hpb pc-q35-2.10 x-remote pc-q35-5.1 pc-i440fx-1.7 pc-q35-2.9 pc-i440fx-2.11 pc-q35-3.1 pc-i440fx-6.1 pc-q35-4.1 pc-q35-jammy ubuntu-q35 pc-i440fx-2.4 pc-i440fx-4.1 pc-q35-eoan pc-q35-jammy-hpb pc-i440fx-5.1 pc-i440fx-2.9 pc-i440fx-bionic-hpb isapc pc-i440fx-1.4 pc-q35-cosmic pc-q35-2.6 pc-i440fx-3.1 pc-q35-bionic pc-q35-disco-hpb pc-i440fx-cosmic pc-q35-2.12 pc-i440fx-bionic pc-q35-groovy-hpb pc-q35-disco pc-i440fx-cosmic-hpb pc-i440fx-2.1 pc-i440fx-wily pc-q35-impish pc-i440fx-2.6 pc-q35-6.0 pc-i440fx-impish pc-q35-impish-hpb pc-q35-hirsute pc-q35-4.0.1 pc-q35-hirsute-hpb pc-i440fx-1.6 pc-q35-5.0 pc-q35-2.8 pc-i440fx-2.10 pc-q35-3.0 pc-q35-zesty pc-q35-4.0 pc-q35-focal microvm pc-i440fx-6.0 pc-i440fx-2.3 pc-i440fx-disco pc-i440fx-focal-hpb pc-i440fx-4.0 pc-i440fx-groovy-hpb pc-i440fx-hirsute-hpb pc-i440fx-5.0 pc-i440fx-2.8 pc-q35-6.2 q35 pc-i440fx-eoan pc-q35-2.5 pc-i440fx-3.0
pc-q35-yakkety Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: pc-q35-2.11 Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: hvm Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: 32 Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-xtensa Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: sim Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: kc705 Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: ml605 Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: ml605-nommu Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: virt Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: lx60-nommu Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: lx200 Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: lx200-nommu Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: lx60 Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: kc705-nommu Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: 
Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: hvm Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: 32 Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-xtensaeb Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: sim Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: kc705 Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: ml605 Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: ml605-nommu Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: virt Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: lx60-nommu Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: lx200 Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: lx200-nommu Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: lx60 Aug 30 14:01:13.907654 np0035104604 nova-compute[107505]: kc705-nommu Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for alpha via machine types: {None} {{(pid=107505) _get_machine_types 
/opt/stack/nova/nova/virt/libvirt/host.py:952}} Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=alpha and machine_type=None: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-alpha Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: clipper Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: alpha Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: rom Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: yes Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: no Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: no Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 
nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: file Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: disk Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: floppy Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: lun Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: fdc Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: usb Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: sata Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.912215 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: sdl Aug 30 14:01:13.917380 np0035104604 
nova-compute[107505]: vnc Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: spice Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: subsystem Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: default Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: optional Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: usb Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: pci Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: random Aug 30 14:01:13.917380 np0035104604 
nova-compute[107505]: egd Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: builtin Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: path Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: handle Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: virtiofs Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: Aug 30 14:01:13.917380 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for armv6l via machine types: {None, 'virt'} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=armv6l and machine_type=None: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: 
Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-arm Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: integratorcp Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: armv6l Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: rom Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: yes Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: no Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: no Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: on Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: off Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: 
pxa270-c0 Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: cortex-a15 Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: pxa270-b0 Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: cortex-m4 Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: pxa270-a0 Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: arm1176 Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: pxa270-b1 Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: cortex-a7 Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: pxa270-a1 Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: cortex-a8 Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: cortex-r5 Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: ti925t Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: cortex-r5f Aug 30 14:01:13.922960 np0035104604 nova-compute[107505]: arm1026 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: cortex-a9 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: cortex-m7 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: pxa270 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: pxa260 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: pxa250 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: pxa270-c5 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: pxa261 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: pxa262 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: sa1110 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: sa1100 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: max Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: cortex-m0 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: cortex-m33 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: arm946 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: pxa255 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: arm11mpcore Aug 30 14:01:13.925400 np0035104604 
nova-compute[107505]: cortex-m55 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: arm926 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: arm1136 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: arm1136-r2 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: cortex-m3 Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: file Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: disk Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: floppy Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: lun Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: fdc Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: usb Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: sata Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:13.925400 np0035104604 
nova-compute[107505]: virtio-non-transitional Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: sdl Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: vnc Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: spice Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.925400 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: subsystem Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: default Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: optional Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: usb Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: pci Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: virtio Aug 
30 14:01:13.928479 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: random Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: egd Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: builtin Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: path Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: handle Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: virtiofs Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: tpm-tis Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: passthrough Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 
nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=armv6l and machine_type=virt: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-arm Aug 30 14:01:13.928479 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: virt-6.2 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: armv6l Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: rom Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: yes Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: no Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: no Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 
np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: on Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: off Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: pxa270-c0 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: cortex-a15 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: pxa270-b0 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: cortex-m4 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: pxa270-a0 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: arm1176 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: pxa270-b1 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: cortex-a7 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: pxa270-a1 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: cortex-a8 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: cortex-r5 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: ti925t Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: cortex-r5f Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: arm1026 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: cortex-a9 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: cortex-m7 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: pxa270 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: pxa260 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: pxa250 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: pxa270-c5 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: pxa261 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: pxa262 Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: sa1110 Aug 30 
Aug 30 14:01:13.931817 np0035104604 nova-compute[107505]: [tag-stripped libvirt domainCapabilities XML elided: remaining CPU models (sa1100, max, cortex-m0, cortex-m33, arm946, pxa255, arm11mpcore, cortex-m55, arm926, arm1136, arm1136-r2, cortex-m3); memory backing source types (file, anonymous, memfd); disk devices (disk, cdrom, floppy, lun) on buses (fdc, scsi, virtio, usb, sata); disk models (virtio, virtio-transitional, virtio-non-transitional); graphics types (sdl, vnc, spice, egl-headless); hostdev mode (subsystem), startup policies (default, mandatory, requisite, optional), subsystem types (usb, pci, scsi); interface models (virtio, virtio-transitional, virtio-non-transitional); rng backends (random, egd, builtin); filesystem drivers (path, handle, virtiofs); tpm model tpm-tis, backend passthrough, backend versions 2 and 3] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:13.939163 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for armv7l via machine types: {'virt'} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Aug 30 14:01:13.939163 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=armv7l and machine_type=virt:
Aug 30 14:01:13.942007 np0035104604 nova-compute[107505]: [tag-stripped domainCapabilities XML elided: emulator /usr/bin/qemu-system-arm, domain qemu, machine virt-6.2, arch armv7l; loader types (rom, pflash), readonly (yes, no), secure (no); ACPI (on, off); CPU models (pxa270-c0, cortex-a15, pxa270-b0, cortex-m4, pxa270-a0, arm1176, pxa270-b1, cortex-a7, pxa270-a1, cortex-a8, cortex-r5, ti925t, cortex-r5f, arm1026, cortex-a9, cortex-m7, pxa270, pxa260, pxa250, pxa270-c5, pxa261, pxa262, sa1110, sa1100, max, cortex-m0, cortex-m33, arm946, pxa255, arm11mpcore, cortex-m55, arm926, arm1136, arm1136-r2, cortex-m3); memory backing, disk, graphics, hostdev, rng, filesystem, and tpm enums identical to the preceding dump] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:13.950412 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for aarch64 via machine types: {'virt'} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Aug 30 14:01:13.950412 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=aarch64 and machine_type=virt:
Aug 30 14:01:13.952760 np0035104604 nova-compute[107505]: [tag-stripped domainCapabilities XML elided: emulator /usr/bin/qemu-system-aarch64, domain qemu, machine virt-6.2, arch aarch64; loader types (rom, pflash), readonly (yes, no), secure (no); ACPI (on, off); CPU models as for armv7l plus cortex-a57, a64fx, cortex-a53, and cortex-a72; memory backing, disk, graphics, hostdev, rng, filesystem, and tpm enums identical to the preceding dumps] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:13.960843 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for cris via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Aug 30 14:01:13.960843 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=cris and machine_type=None:
Aug 30 14:01:13.964141 np0035104604 nova-compute[107505]: [tag-stripped domainCapabilities XML elided: emulator /usr/bin/qemu-system-cris, domain qemu, machine axis-dev88, arch cris; loader types (rom, pflash), readonly (yes, no), secure (no); memory backing (file, anonymous, memfd); disk devices (disk, cdrom, floppy, lun) on buses (fdc, scsi, virtio); disk model (virtio); graphics types (sdl, vnc, spice, egl-headless); hostdev mode (subsystem), startup policies (default, mandatory, requisite, optional), subsystem types (pci, scsi); rng backends (random, egd, builtin); filesystem drivers (path, handle)] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:13.967496 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for i686 via machine types: {'ubuntu-q35', 'pc', 'ubuntu', 'q35'} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Aug 30 14:01:13.967496 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu-q35:
Aug 30 14:01:13.970594 np0035104604 nova-compute[107505]: [tag-stripped domainCapabilities XML elided, truncated in this excerpt: emulator /usr/bin/qemu-system-i386, domain qemu, machine pc-q35-jammy, arch i686; loader types (rom, pflash), readonly (yes, no), secure (no); ACPI (on, off); host CPU model EPYC, vendor AMD; custom CPU models (qemu64, qemu32, phenom, pentium3, pentium2, pentium, n270, kvm64, kvm32, coreduo, core2duo, athlon, Westmere-IBRS, Westmere, Snowridge, ...)]
Skylake-Server-noTSX-IBRS Aug 30 14:01:13.970594 np0035104604 nova-compute[107505]: Skylake-Server-IBRS Aug 30 14:01:13.970594 np0035104604 nova-compute[107505]: Skylake-Server Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Skylake-Client-noTSX-IBRS Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Skylake-Client-IBRS Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Skylake-Client Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: SandyBridge-IBRS Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: SandyBridge Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Penryn Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Opteron_G5 Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Opteron_G4 Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Opteron_G3 Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Opteron_G2 Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Opteron_G1 Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Nehalem-IBRS Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Nehalem Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: IvyBridge-IBRS Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: IvyBridge Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Icelake-Server-noTSX Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Icelake-Server Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Icelake-Client-noTSX Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Icelake-Client Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Haswell-noTSX-IBRS Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Haswell-noTSX Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Haswell-IBRS Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Haswell Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: EPYC-Rome Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: EPYC-Milan Aug 30 14:01:13.972255 
np0035104604 nova-compute[107505]: EPYC-IBPB Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: EPYC Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Dhyana Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Cooperlake Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Conroe Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Cascadelake-Server-noTSX Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Cascadelake-Server Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Broadwell-noTSX-IBRS Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Broadwell-noTSX Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Broadwell-IBRS Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Broadwell Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: 486 Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: file Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: Aug 30 14:01:13.972255 np0035104604 nova-compute[107505]: disk Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: floppy Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: lun Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: fdc Aug 30 14:01:13.975837 
np0035104604 nova-compute[107505]: scsi Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: usb Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: sata Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: sdl Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: vnc Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: spice Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: subsystem Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: default Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: optional Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: usb Aug 30 
14:01:13.975837 np0035104604 nova-compute[107505]: pci Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: random Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: egd Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: builtin Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: path Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: handle Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: virtiofs Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: Aug 30 14:01:13.975837 np0035104604 nova-compute[107505]: tpm-tis Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: tpm-crb Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: passthrough Aug 30 14:01:13.980479 
np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-i386 Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: pc-i440fx-6.2 Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: i686 Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: rom Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 
np0035104604 nova-compute[107505]: yes Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: no Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: no Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: on Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: off Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: EPYC Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: AMD Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.980479 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 
np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: qemu64 Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: qemu32 Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: phenom Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: pentium3 Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: pentium2 Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: pentium Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: n270 Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: kvm64 Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: kvm32 Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: coreduo Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: core2duo Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: athlon Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Westmere-IBRS Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Westmere Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Snowridge Aug 30 14:01:13.985639 np0035104604 
nova-compute[107505]: Skylake-Server-noTSX-IBRS Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Skylake-Server-IBRS Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Skylake-Server Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Skylake-Client-noTSX-IBRS Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Skylake-Client-IBRS Aug 30 14:01:13.985639 np0035104604 nova-compute[107505]: Skylake-Client Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: SandyBridge-IBRS Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: SandyBridge Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Penryn Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Opteron_G5 Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Opteron_G4 Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Opteron_G3 Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Opteron_G2 Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Opteron_G1 Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Nehalem-IBRS Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Nehalem Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: IvyBridge-IBRS Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: IvyBridge Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Icelake-Server-noTSX Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Icelake-Server Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Icelake-Client-noTSX Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Icelake-Client Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Haswell-noTSX-IBRS Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Haswell-noTSX Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Haswell-IBRS Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Haswell Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: EPYC-Rome Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: EPYC-Milan Aug 
30 14:01:13.989185 np0035104604 nova-compute[107505]: EPYC-IBPB Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: EPYC Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Dhyana Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Cooperlake Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Conroe Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Cascadelake-Server-noTSX Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Cascadelake-Server Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Broadwell-noTSX-IBRS Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Broadwell-noTSX Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Broadwell-IBRS Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Broadwell Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: 486 Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: file Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: disk Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: floppy Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: lun Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: Aug 30 14:01:13.989185 np0035104604 nova-compute[107505]: ide Aug 30 
14:01:13.993649 np0035104604 nova-compute[107505]: fdc Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: usb Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: sata Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: sdl Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: vnc Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: spice Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: subsystem Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: default Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: optional Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: 
Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: usb Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: pci Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: random Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: egd Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: builtin Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: path Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: handle Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: virtiofs Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: tpm-tis Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: tpm-crb Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 np0035104604 nova-compute[107505]: Aug 30 14:01:13.993649 
np0035104604 nova-compute[107505]: passthrough Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-i386 Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: pc-i440fx-jammy Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: i686 Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: rom Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 
14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: yes Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: no Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: no Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: on Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: off Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: EPYC Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: AMD Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:13.999812 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 
14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: qemu64 Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: qemu32 Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: phenom Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: pentium3 Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: pentium2 Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: pentium Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: n270 Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: kvm64 Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: kvm32 Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: coreduo Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: core2duo Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: athlon Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Westmere-IBRS Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: Westmere Aug 30 14:01:14.004807 np0035104604 
Aug 30 14:01:14.004807 np0035104604 nova-compute[107505]: [tail of a libvirt domain capabilities XML dump, tags stripped by log extraction: named CPU models Snowridge, Skylake-Server(-noTSX)(-IBRS), Skylake-Client(-noTSX)(-IBRS), SandyBridge(-IBRS), Penryn, Opteron_G1 through Opteron_G5, Nehalem(-IBRS), IvyBridge(-IBRS), Icelake-Server/Client(-noTSX), Haswell(-noTSX)(-IBRS), EPYC(-Rome/-Milan/-IBPB), Dhyana, Cooperlake, Conroe, Cascadelake-Server(-noTSX), Broadwell(-noTSX)(-IBRS), 486; memory backing file/anonymous/memfd; disk devices disk/cdrom/floppy/lun on buses ide/fdc/scsi/virtio/usb/sata; virtio models virtio/virtio-transitional/virtio-non-transitional; graphics sdl/vnc/spice/egl-headless; hostdev subsystem with startupPolicy default/mandatory/requisite/optional and types usb/pci/scsi; rng backends random/egd/builtin; filesystem drivers path/handle/virtiofs; TPM models tpm-tis/tpm-crb with passthrough backend] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.014399 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Aug 30 14:01:14.014399 np0035104604 nova-compute[107505]: [domain capabilities XML dump, tags stripped: emulator /usr/bin/qemu-system-i386, domain type qemu, machine pc-q35-6.2, arch i686; os loader types rom/pflash (readonly yes/no, secure no); ACPI on/off; host-model CPU EPYC (vendor AMD); custom CPU models qemu64, qemu32, phenom, pentium3, pentium2, pentium, n270, kvm64, kvm32, coreduo, core2duo, athlon, Westmere(-IBRS), plus the same Snowridge-through-486 list as above; memory backing file/anonymous/memfd; disk devices disk/cdrom/floppy/lun on buses fdc/scsi/virtio/usb/sata; virtio models virtio/virtio-transitional/virtio-non-transitional; graphics sdl/vnc/spice/egl-headless; hostdev subsystem with startupPolicy default/mandatory/requisite/optional and types usb/pci/scsi; rng backends random/egd/builtin; filesystem drivers path/handle/virtiofs; TPM models tpm-tis/tpm-crb with passthrough backend] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.027201 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for m68k via machine types: {None, 'virt'} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Aug 30 14:01:14.027201 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=m68k and machine_type=None:
Aug 30 14:01:14.027201 np0035104604 nova-compute[107505]: [domain capabilities XML dump, tags stripped: emulator /usr/bin/qemu-system-m68k, domain type qemu, machine mcf5208evb, arch m68k; os loader types rom/pflash; memory backing file/anonymous/memfd; disk devices disk/cdrom/floppy/lun on buses fdc/scsi/virtio; graphics sdl/vnc/spice/egl-headless; hostdev subsystem with startupPolicy default/mandatory/requisite/optional and types pci/scsi; rng backends random/egd/builtin; filesystem drivers path/handle/virtiofs] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.030141 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=m68k and machine_type=virt:
Aug 30 14:01:14.030141 np0035104604 nova-compute[107505]: [domain capabilities XML dump, tags stripped: emulator /usr/bin/qemu-system-m68k, domain type qemu, machine virt-6.2, arch m68k; same loader, memory-backing, disk, graphics, hostdev, rng, and filesystem sections as the mcf5208evb dump above] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for microblaze via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=microblaze and machine_type=None:
Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: [start of domain capabilities XML dump, tags stripped: emulator /usr/bin/qemu-system-microblaze, domain type qemu, machine petalogix-s3adsp1800, arch microblaze …]
nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: rom Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: yes Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: file Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: Aug 30 14:01:14.038796 np0035104604 nova-compute[107505]: disk Aug 30 
14:01:14.042048 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: floppy Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: lun Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: fdc Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: sdl Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: vnc Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: spice Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: subsystem Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: default Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: optional Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 
nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: pci Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: random Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: egd Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: builtin Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: path Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: handle Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: Aug 30 14:01:14.042048 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities 
/opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for microblazeel via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=microblazeel and machine_type=None: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-microblazeel Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: petalogix-s3adsp1800 Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: microblazeel Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: rom Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: yes Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 
nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: file Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: disk Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: floppy Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: lun Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: fdc Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: Aug 30 14:01:14.046905 np0035104604 
nova-compute[107505]: sdl Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: vnc Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: spice Aug 30 14:01:14.046905 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: subsystem Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: default Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: optional Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: pci Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: random Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: egd Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: builtin Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 
14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: path Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: handle Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for mips via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=mips and machine_type=None: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-mips Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: malta Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: mips Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 
14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.051308 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: rom Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: yes Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: file Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 
nova-compute[107505]: disk Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: floppy Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: lun Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: ide Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: fdc Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: sata Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: sdl Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: vnc Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: spice Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: subsystem Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 
nova-compute[107505]: Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: default Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:14.055099 np0035104604 nova-compute[107505]: optional Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: pci Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: random Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: egd Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: builtin Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: path Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: handle Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: virtiofs Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 
nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for mipsel via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=mipsel and machine_type=None: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-mipsel Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: malta Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: mipsel Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 
nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: rom Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.059015 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: yes Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: file Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: disk Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: floppy Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: lun Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: 
Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: ide Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: fdc Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: sata Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: sdl Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: vnc Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: spice Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: subsystem Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: default Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:14.062021 np0035104604 
nova-compute[107505]: optional Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: pci Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.062021 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: random Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: egd Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: builtin Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: path Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: handle Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: virtiofs Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: Aug 30 
14:01:14.066828 np0035104604 nova-compute[107505]: [tail of the preceding domainCapabilities dump elided; XML markup was stripped in this capture] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for mips64 via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=mips64 and machine_type=None:
Aug 30 14:01:14.066828 np0035104604 nova-compute[107505]: [domainCapabilities XML elided, markup stripped; recoverable values: emulator /usr/bin/qemu-system-mips64, domain type qemu, machine malta, arch mips64; loader types rom, pflash; memory backing source types file, anonymous, memfd; disk device types disk, cdrom, floppy, lun; disk bus types ide, fdc, scsi, virtio, usb, sata; virtio model variants virtio, virtio-transitional, virtio-non-transitional; graphics types sdl, vnc, spice, egl-headless; hostdev mode subsystem with start policies default, mandatory, requisite, optional and subsystem types usb, pci, scsi; rng backend models random, egd, builtin; filesystem driver types path, handle, virtiofs] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.076194 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for mips64el via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Aug 30 14:01:14.076194 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=mips64el and machine_type=None:
Aug 30 14:01:14.076194 np0035104604 nova-compute[107505]: [domainCapabilities XML elided, markup stripped; recoverable values: emulator /usr/bin/qemu-system-mips64el, domain type qemu, machine malta, arch mips64el; loader types rom, pflash; memory backing source types file, anonymous, memfd; disk device types disk, cdrom, floppy, lun; disk bus types ide, fdc, scsi, virtio, usb, sata; virtio model variants virtio, virtio-transitional, virtio-non-transitional; graphics types sdl, vnc, spice, egl-headless; hostdev mode subsystem with start policies default, mandatory, requisite, optional and subsystem types usb, pci, scsi; rng backend models random, egd, builtin; filesystem driver types path, handle, virtiofs] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.086943 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for ppc via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Aug 30 14:01:14.086943 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=ppc and machine_type=None:
Aug 30 14:01:14.086943 np0035104604 nova-compute[107505]: [domainCapabilities XML elided, markup stripped; recoverable values: emulator /usr/bin/qemu-system-ppc, domain type qemu, machine g3beige, arch ppc; loader types rom, pflash; on/off feature toggles; memory backing source types file, anonymous, memfd; disk device types disk, cdrom, floppy, lun; disk bus types ide, fdc, scsi, virtio, usb, sata; virtio model variants virtio, virtio-transitional, virtio-non-transitional; graphics types sdl, vnc, spice, egl-headless; hostdev mode subsystem with start policies default, mandatory, requisite, optional and subsystem types usb, pci, scsi; rng backend models random, egd, builtin; filesystem driver types path, handle, virtiofs] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.095883 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for ppc64 via machine types: {'pseries', 'powernv', None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Aug 30 14:01:14.095883 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=ppc64 and machine_type=pseries:
Aug 30 14:01:14.095883 np0035104604 nova-compute[107505]: [domainCapabilities XML elided, markup stripped; recoverable values: emulator /usr/bin/qemu-system-ppc64, domain type qemu, machine pseries-jammy, arch ppc64; loader types rom, pflash; on/off feature toggles; CPU models POWER9, POWER8, POWER7; memory backing source types file, anonymous, memfd; disk device types disk, cdrom, lun; disk bus types scsi, virtio, usb, sata; virtio model variants virtio, virtio-transitional, virtio-non-transitional; graphics types sdl, vnc, spice, egl-headless; hostdev mode subsystem with start policies default, mandatory, requisite, optional and subsystem types usb, pci, scsi; rng backend models random, egd, builtin; filesystem driver types path, handle, virtiofs; tpm models tpm-spapr, spapr-tpm-proxy with backend passthrough] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.105297 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=ppc64 and machine_type=powernv:
Aug 30 14:01:14.105297 np0035104604 nova-compute[107505]: [domainCapabilities XML elided, markup stripped; recoverable values: emulator /usr/bin/qemu-system-ppc64, domain type qemu, machine powernv9, arch ppc64; loader types rom, pflash; on/off feature toggles; CPU models POWER9, POWER8, POWER7; memory backing source types file, anonymous, memfd; disk device types disk, cdrom, floppy, lun; disk bus types fdc, scsi, virtio, usb, sata; virtio model variants virtio, virtio-transitional, virtio-non-transitional; graphics types sdl, vnc, spice, egl-headless; hostdev mode subsystem with start policies default, mandatory, requisite, optional and subsystem types usb, pci, scsi; rng backend models random, egd, builtin; filesystem driver types path, handle, virtiofs; tpm models tpm-spapr, spapr-tpm-proxy with backend passthrough] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=ppc64 and machine_type=None:
Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: [domainCapabilities XML elided, markup stripped; recoverable values so far: emulator /usr/bin/qemu-system-ppc64, domain type qemu, machine pseries-jammy, arch ppc64; dump truncated in this excerpt] Aug 30
14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: rom Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: yes Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: on Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: off Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: POWER9 Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: POWER8 Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: POWER7 Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: Aug 30 14:01:14.111024 np0035104604 nova-compute[107505]: file Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:14.113463 
np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: disk Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: lun Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: sata Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: sdl Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: vnc Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: spice Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 
nova-compute[107505]: subsystem Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: default Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: optional Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: pci Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: random Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: egd Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: builtin Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: path Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: handle Aug 30 14:01:14.113463 np0035104604 
nova-compute[107505]: virtiofs Aug 30 14:01:14.113463 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: tpm-spapr Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: spapr-tpm-proxy Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: passthrough Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for ppc64le via machine types: {'pseries', 'powernv'} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=ppc64le and machine_type=pseries: Aug 30 14:01:14.116827 
np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-ppc64le Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: pseries-jammy Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: ppc64le Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: rom Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: yes Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: on Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: off Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.116827 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 
np0035104604 nova-compute[107505]: POWER9 Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: POWER8 Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: POWER7 Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: file Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: disk Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: lun Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: sata Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 
nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: sdl Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: vnc Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: spice Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: subsystem Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: default Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: optional Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: pci Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 
nova-compute[107505]: Aug 30 14:01:14.119371 np0035104604 nova-compute[107505]: random Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: egd Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: builtin Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: path Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: handle Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: virtiofs Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: tpm-spapr Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: spapr-tpm-proxy Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: passthrough Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities 
/opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=ppc64le and machine_type=powernv: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-ppc64le Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: powernv9 Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: ppc64le Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: rom Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: yes Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: on Aug 30 14:01:14.123637 
np0035104604 nova-compute[107505]: off Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: POWER9 Aug 30 14:01:14.123637 np0035104604 nova-compute[107505]: POWER8 Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: POWER7 Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: file Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: disk Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: floppy Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: lun Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: fdc Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: sata Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 
14:01:14.128573 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: sdl Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: vnc Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: spice Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: subsystem Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: default Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: optional Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: pci Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 
14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: random Aug 30 14:01:14.128573 np0035104604 nova-compute[107505]: egd Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: builtin Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: path Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: handle Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: virtiofs Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: tpm-spapr Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: spapr-tpm-proxy Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: passthrough Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 
14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for riscv32 via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=riscv32 and machine_type=None: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-riscv32 Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: spike Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: riscv32 Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: rom Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: yes Aug 
30 14:01:14.131758 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.131758 np0035104604 nova-compute[107505]: Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: file Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: disk Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: floppy Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: lun Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: fdc Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.134499 np0035104604 nova-compute[107505]: usb Aug 30 
14:01:14.134499 np0035104604 nova-compute[107505]: [multi-line domain capabilities XML elided: element tags were stripped in capture; surviving values included disk buses (virtio, usb, sata), virtio model variants (virtio, virtio-transitional, virtio-non-transitional), graphics types (sdl, vnc, spice, egl-headless), hostdev mode (subsystem) and policies (default, mandatory, requisite, optional), hostdev subsystem types (usb, pci, scsi), rng backends (random, egd, builtin), and filesystem drivers (path, handle, virtiofs)] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.137563 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for riscv64 via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Aug 30 14:01:14.137563 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=riscv64 and machine_type=None: [multi-line capabilities XML elided, tags stripped; surviving values included emulator /usr/bin/qemu-system-riscv64, domain type qemu, machine spike, arch riscv64, loader types (rom, pflash), yes/no toggles, memory backing sources (file, anonymous, memfd), disk device types (disk, cdrom, floppy, lun), disk buses (fdc, scsi, virtio, usb, sata), virtio model variants, graphics types (sdl, vnc, spice, egl-headless), hostdev (subsystem; default, mandatory, requisite, optional; usb, pci, scsi), rng backends (random, egd, builtin), filesystem drivers (path, handle, virtiofs)] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.143736 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for s390x via machine types: {'s390-ccw-virtio'} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Aug 30 14:01:14.143736 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=s390x and machine_type=s390-ccw-virtio: [multi-line capabilities XML elided, tags stripped; surviving values included emulator /usr/bin/qemu-system-s390x, domain type qemu, machine s390-ccw-virtio-jammy, arch s390x, loader types (rom, pflash), on/off toggles, an extensive s390 CPU model list (z800 through z14/z15/z16-era models such as gen15a, gen15b, gen16a, gen16b, their -base variants, plus max and qemu), memory backing sources (file, anonymous, memfd), disk device types (disk, cdrom, floppy, lun), disk buses (fdc, scsi, virtio), virtio model variants, graphics types (sdl, vnc, spice, egl-headless), hostdev (subsystem; default, mandatory, requisite, optional; pci, scsi), rng backends (random, egd, builtin), filesystem drivers (path, handle, virtiofs)] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.154755 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for sh4 via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Aug 30 14:01:14.154755 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=sh4 and machine_type=None: [multi-line capabilities XML elided, tags stripped; surviving values included emulator /usr/bin/qemu-system-sh4, domain type qemu, machine shix, arch sh4, loader types (rom, pflash), yes/no toggles, memory backing sources (file, anonymous, memfd), disk device types (disk, cdrom, floppy, lun), disk buses (fdc, scsi, virtio, usb, sata), virtio model variants, graphics types (sdl, vnc, spice, egl-headless), hostdev (subsystem; default, mandatory, requisite, optional; usb, pci, scsi), rng backends (random, egd, builtin), filesystem drivers (path, handle, virtiofs)] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.160887 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for sh4eb via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Aug 30 14:01:14.160887 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=sh4eb and machine_type=None: [multi-line capabilities XML elided, tags stripped; surviving values included emulator /usr/bin/qemu-system-sh4eb, domain type qemu, machine shix, arch sh4eb, loader types (rom, pflash), yes/no toggles, memory backing sources (file, anonymous, memfd), disk device types (disk, cdrom, floppy, lun), disk buses (fdc, scsi), virtio model variants, graphics types (sdl, vnc, spice, egl-headless), hostdev (subsystem; default, mandatory, requisite, optional; usb, pci, scsi), rng backends (random, egd, builtin), filesystem drivers (path, handle, virtiofs); dump continues past this excerpt] Aug 30
14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for sparc via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=sparc and machine_type=None: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-sparc Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: SS-5 Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: sparc Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: rom Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: yes Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 
14:01:14.166920 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.166920 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: file Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: disk Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: floppy Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: lun Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: fdc Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 
np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: sdl Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: vnc Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: spice Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: subsystem Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: default Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: optional Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: pci Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: random Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: egd Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: builtin Aug 30 
14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: path Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: handle Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.170419 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for sparc64 via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=sparc64 and machine_type=None: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-sparc64 Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:14.173136 
np0035104604 nova-compute[107505]: sun4u Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: sparc64 Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: rom Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: yes Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: file Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 
nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: disk Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: floppy Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: lun Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: ide Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: fdc Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: sata Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.173136 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: sdl Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: vnc Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: spice Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 
nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: subsystem Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: default Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: optional Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: pci Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: random Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: egd Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: builtin Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: path Aug 30 14:01:14.176911 np0035104604 
nova-compute[107505]: handle Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: virtiofs Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: Aug 30 14:01:14.176911 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for x86_64 via machine types: {'ubuntu-q35', 'pc', 'ubuntu', 'q35'} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu-q35: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-x86_64 Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: pc-q35-jammy Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: x86_64 Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 
nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: rom Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: yes Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: on Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: off Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: EPYC Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: AMD Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 
np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.180474 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: qemu64 Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: qemu32 Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: phenom Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: pentium3 Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: pentium2 Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: pentium Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: n270 Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: kvm64 Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: kvm32 Aug 30 14:01:14.183384 np0035104604 
nova-compute[107505]: coreduo Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: core2duo Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: athlon Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Westmere-IBRS Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Westmere Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Snowridge Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Skylake-Server-noTSX-IBRS Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Skylake-Server-IBRS Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Skylake-Server Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Skylake-Client-noTSX-IBRS Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Skylake-Client-IBRS Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Skylake-Client Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: SandyBridge-IBRS Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: SandyBridge Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Penryn Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Opteron_G5 Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Opteron_G4 Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Opteron_G3 Aug 30 14:01:14.183384 np0035104604 nova-compute[107505]: Opteron_G2 Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Opteron_G1 Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Nehalem-IBRS Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Nehalem Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: IvyBridge-IBRS Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: IvyBridge Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Icelake-Server-noTSX Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Icelake-Server Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Icelake-Client-noTSX Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Icelake-Client Aug 30 14:01:14.185093 
np0035104604 nova-compute[107505]: Haswell-noTSX-IBRS Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Haswell-noTSX Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Haswell-IBRS Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Haswell Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: EPYC-Rome Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: EPYC-Milan Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: EPYC-IBPB Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: EPYC Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Dhyana Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Cooperlake Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Conroe Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Cascadelake-Server-noTSX Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Cascadelake-Server Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Broadwell-noTSX-IBRS Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Broadwell-noTSX Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Broadwell-IBRS Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Broadwell Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: 486 Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: file Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Aug 30 14:01:14.185093 np0035104604 
nova-compute[107505]: disk Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: floppy Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: lun Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: fdc Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: sata Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Aug 30 14:01:14.185093 np0035104604 nova-compute[107505]: Aug 30 14:01:14.187951 np0035104604 nova-compute[107505]: Aug 30 14:01:14.187951 np0035104604 nova-compute[107505]: Aug 30 14:01:14.187951 np0035104604 nova-compute[107505]: sdl Aug 30 14:01:14.187951 np0035104604 nova-compute[107505]: vnc Aug 30 14:01:14.187951 np0035104604 nova-compute[107505]: spice Aug 30 14:01:14.187951 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:14.187951 np0035104604 nova-compute[107505]: Aug 30 14:01:14.187951 np0035104604 nova-compute[107505]: Aug 30 14:01:14.187951 np0035104604 nova-compute[107505]: Aug 30 14:01:14.187951 np0035104604 nova-compute[107505]: Aug 30 14:01:14.187951 np0035104604 nova-compute[107505]: Aug 30 14:01:14.187951 np0035104604 nova-compute[107505]: subsystem Aug 30 14:01:14.187951 np0035104604 nova-compute[107505]: Aug 30 14:01:14.187951 np0035104604 nova-compute[107505]: Aug 30 14:01:14.187951 np0035104604 
Aug 30 14:01:14.187951 np0035104604 nova-compute[107505]: [domain capabilities XML, tags lost in capture; tail of previous dump: hostdev startupPolicy default/mandatory/requisite/optional, subsystem types usb/pci/scsi; video models virtio/virtio-transitional/virtio-non-transitional; rng backend models random/egd/builtin; filesystem driver types path/handle/virtiofs; TPM models tpm-tis/tpm-crb, backend passthrough] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.191176 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: [domain capabilities XML, tags lost in capture: emulator /usr/bin/qemu-system-x86_64; domain type qemu; machine pc-i440fx-6.2; arch x86_64; loader types rom/pflash, readonly yes/no, secure no; CPU modes: host-passthrough (migratable on/off), host-model EPYC (vendor AMD), custom with models qemu64, qemu32, phenom, pentium3, pentium2, pentium, n270, kvm64, kvm32, coreduo, core2duo, athlon, Westmere-IBRS, Westmere, Snowridge, Skylake-Server-noTSX-IBRS, Skylake-Server-IBRS, Skylake-Server, Skylake-Client-noTSX-IBRS, Skylake-Client-IBRS, Skylake-Client, SandyBridge-IBRS, SandyBridge, Penryn, Opteron_G5, Opteron_G4, Opteron_G3, Opteron_G2, Opteron_G1, Nehalem-IBRS, Nehalem, IvyBridge-IBRS, IvyBridge, Icelake-Server-noTSX, Icelake-Server, Icelake-Client-noTSX, Icelake-Client, Haswell-noTSX-IBRS, Haswell-noTSX, Haswell-IBRS, Haswell, EPYC-Rome, EPYC-Milan, EPYC-IBPB, EPYC, Dhyana, Cooperlake, Conroe, Cascadelake-Server-noTSX, Cascadelake-Server, Broadwell-noTSX-IBRS, Broadwell-noTSX, Broadwell-IBRS, Broadwell, 486; memory backing source types file/anonymous/memfd; disk devices disk/cdrom/floppy/lun, buses ide/fdc/scsi/virtio/usb/sata, models virtio/virtio-transitional/virtio-non-transitional; graphics types sdl/vnc/spice/egl-headless; hostdev mode subsystem, startupPolicy default/mandatory/requisite/optional, subsystem types usb/pci/scsi; video models virtio/virtio-transitional/virtio-non-transitional; rng backend models random/egd/builtin; filesystem driver types path/handle/virtiofs; TPM models tpm-tis/tpm-crb, backend passthrough] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.202749 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu: [domain capabilities XML, tags lost in capture: emulator /usr/bin/qemu-system-x86_64; domain type qemu; machine pc-i440fx-jammy; arch x86_64; remaining enum values identical to the machine_type=pc dump above] {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Aug 30 14:01:14.209003 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: [domain capabilities XML, tags lost in capture: emulator /usr/bin/qemu-system-x86_64; domain type qemu; machine pc-q35-6.2; arch x86_64; enum values as in the machine_type=pc dump above, except disk buses fdc/scsi/virtio/usb/sata (no ide); dump cut off mid-output at 14:01:14.231644]
np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: optional Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: pci Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: random Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: egd Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: builtin Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: path Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: handle Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: virtiofs Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 
np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: tpm-tis Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: tpm-crb Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: passthrough Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:14.231644 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for xtensa via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=xtensa and machine_type=None: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-xtensa Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: sim Aug 30 
14:01:14.235743 np0035104604 nova-compute[107505]: xtensa Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: rom Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: yes Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: file Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 
nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: disk Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: floppy Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: lun Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: fdc Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: sata Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: sdl Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: vnc Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: spice Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.235743 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: subsystem Aug 30 14:01:14.245350 np0035104604 
nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: default Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: optional Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: pci Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: random Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: egd Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: builtin Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: path Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: handle Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: virtiofs Aug 30 14:01:14.245350 np0035104604 
nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Getting domain capabilities for xtensaeb via machine types: {None} {{(pid=107505) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt host hypervisor capabilities for arch=xtensaeb and machine_type=None: Aug 30 14:01:14.245350 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: /usr/bin/qemu-system-xtensaeb Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: qemu Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: sim Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: xtensaeb Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 
nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: rom Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: pflash Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: yes Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: no Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: file Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: anonymous Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: memfd Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: disk Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: cdrom Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: floppy Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: lun 
Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: fdc Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: sata Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: sdl Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: vnc Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: spice Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: egl-headless Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: subsystem Aug 30 14:01:14.249801 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: default Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: mandatory Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: requisite Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: 
optional Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: usb Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: pci Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: scsi Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: virtio Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: virtio-transitional Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: virtio-non-transitional Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: random Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: egd Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: builtin Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: path Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: handle Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: virtiofs Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 
np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: {{(pid=107505) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Checking secure boot support for host arch (x86_64) {{(pid=107505) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1779}} Aug 30 14:01:14.255800 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Checking secure boot support for host arch (x86_64) {{(pid=107505) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1779}} Aug 30 14:01:14.259928 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Checking secure boot support for host arch (x86_64) {{(pid=107505) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1779}} Aug 30 14:01:14.259928 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Checking secure boot support for host arch (x86_64) {{(pid=107505) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1779}} Aug 30 14:01:14.259928 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Checking secure boot support for host arch (x86_64) {{(pid=107505) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1779}} Aug 30 14:01:14.259928 np0035104604 nova-compute[107505]: 
DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Checking secure boot support for host arch (x86_64) {{(pid=107505) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1779}} Aug 30 14:01:14.259928 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Checking secure boot support for host arch (x86_64) {{(pid=107505) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1779}} Aug 30 14:01:14.259928 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Checking secure boot support for host arch (x86_64) {{(pid=107505) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1779}} Aug 30 14:01:14.259928 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] No Secure Boot support detected {{(pid=107505) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1789}} Aug 30 14:01:14.260532 np0035104604 nova-compute[107505]: INFO nova.virt.node [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Generated node identity 600ab55f-530c-4be6-bf02-067d68ce7ee4 Aug 30 14:01:14.260532 np0035104604 nova-compute[107505]: INFO nova.virt.node [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Wrote node identity 600ab55f-530c-4be6-bf02-067d68ce7ee4 to /opt/stack/data/nova/compute_id Aug 30 14:01:14.260532 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Compute nodes ['600ab55f-530c-4be6-bf02-067d68ce7ee4'] for host np0035104604 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. 
Aug 30 14:01:14.267334 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Aug 30 14:01:14.304676 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] No compute node record found for host np0035104604. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host np0035104604 could not be found.
Aug 30 14:01:14.305061 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
Aug 30 14:01:14.305475 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
Aug 30 14:01:14.305874 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
Aug 30 14:01:14.306329 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
Aug 30 14:01:14.307083 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
Aug 30 14:01:14.958353 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.650s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
Aug 30 14:01:15.043336 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Aug 30 14:01:15.045837 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=2452MB free_disk=29.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
Aug 30 14:01:15.046234 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
Aug 30 14:01:15.046839 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
Aug 30 14:01:15.083882 np0035104604 nova-compute[107505]: WARNING nova.compute.resource_tracker [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] No compute node record for np0035104604:600ab55f-530c-4be6-bf02-067d68ce7ee4: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 600ab55f-530c-4be6-bf02-067d68ce7ee4 could not be found.
Aug 30 14:01:15.114153 np0035104604 nova-compute[107505]: INFO nova.compute.resource_tracker [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Compute node record created for np0035104604:np0035104604 with uuid: 600ab55f-530c-4be6-bf02-067d68ce7ee4
Aug 30 14:01:15.241689 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Total usable vcpus: 8, total allocated vcpus: 0 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
Aug 30 14:01:15.242046 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=512MB phys_disk=29GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
Aug 30 14:01:16.127444 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [req-833548af-217e-43ac-9de8-16376ead666d] Created resource provider record via placement API for resource provider with UUID 600ab55f-530c-4be6-bf02-067d68ce7ee4 and name np0035104604.
Aug 30 14:01:16.300779 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:01:17.116970 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.816s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:01:17.122689 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] /sys/module/kvm_amd/parameters/sev does not exist {{(pid=107505) _kernel_supports_amd_sev /opt/stack/nova/nova/virt/libvirt/host.py:1795}} Aug 30 14:01:17.122954 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] kernel doesn't support AMD SEV Aug 30 14:01:17.123855 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Updating inventory in ProviderTree for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 with inventory: {'MEMORY_MB': {'total': 7746, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 29, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Aug 30 14:01:17.124439 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] CPU mode 
'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:01:17.130997 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Libvirt baseline CPU Aug 30 14:01:17.130997 np0035104604 nova-compute[107505]: x86_64 Aug 30 14:01:17.130997 np0035104604 nova-compute[107505]: Nehalem Aug 30 14:01:17.130997 np0035104604 nova-compute[107505]: Intel Aug 30 14:01:17.130997 np0035104604 nova-compute[107505]: Aug 30 14:01:17.130997 np0035104604 nova-compute[107505]: Aug 30 14:01:17.130997 np0035104604 nova-compute[107505]: {{(pid=107505) _get_guest_baseline_cpu_features /opt/stack/nova/nova/virt/libvirt/driver.py:12513}} Aug 30 14:01:17.328988 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Updated inventory for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7746, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 29, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} Aug 30 14:01:17.329439 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Updating resource provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 generation from 0 to 1 during operation: update_inventory {{(pid=107505) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Aug 30 14:01:17.329811 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None 
req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Updating inventory in ProviderTree for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 with inventory: {'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Aug 30 14:01:17.785806 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Updating resource provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 generation from 1 to 2 during operation: update_traits {{(pid=107505) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Aug 30 14:01:17.822981 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:01:17.823327 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.777s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:01:17.823597 np0035104604 nova-compute[107505]: DEBUG nova.service [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Creating RPC server for service compute {{(pid=107505) start /opt/stack/nova/nova/service.py:182}} Aug 30 14:01:17.859081 np0035104604 nova-compute[107505]: DEBUG nova.service [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] 
Join ServiceGroup membership for this service compute {{(pid=107505) start /opt/stack/nova/nova/service.py:199}} Aug 30 14:01:17.859642 np0035104604 nova-compute[107505]: DEBUG nova.servicegroup.drivers.db [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] DB_Driver: join new ServiceGroup member np0035104604 to the compute group, service = {{(pid=107505) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} Aug 30 14:01:40.863892 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._sync_power_states {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:01:40.884586 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:02:05.029218 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:02:05.030474 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:02:05.031006 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:02:05.031423 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:02:05.050984 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Didn't find any instances for network info cache update. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9928}} Aug 30 14:02:05.052165 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:02:05.052917 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:02:05.053653 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:02:05.054678 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:02:05.055424 np0035104604 nova-compute[107505]: DEBUG 
oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:02:05.056191 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:02:05.056817 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:02:05.057361 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:02:05.095708 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:02:05.096844 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:02:05.097422 np0035104604 
nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:02:05.097876 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:02:05.098757 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:02:05.688543 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.589s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:02:05.770664 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Aug 30 14:02:05.772262 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1846MB free_disk=29.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:02:05.772554 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:02:05.772958 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:02:05.995190 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 0 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:02:05.995524 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=512MB phys_disk=29GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:02:06.143048 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:02:06.700008 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.556s {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:02:06.706698 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:02:06.721003 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:02:06.725171 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:02:06.725661 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.953s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:03:06.714390 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task 
ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:03:06.714965 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:03:06.715189 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:03:06.715542 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:03:06.715911 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:03:06.716245 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:03:06.716594 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task 
ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:03:06.716992 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:03:06.717301 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:03:06.750166 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:03:06.750710 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:03:06.751086 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:03:06.751460 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker 
[None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:03:06.752109 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:03:07.422754 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.670s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:03:07.503529 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Aug 30 14:03:07.505568 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1897MB free_disk=29.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:03:07.506124 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:03:07.506565 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:03:07.745051 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 0 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:03:07.745501 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=512MB phys_disk=29GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:03:07.896273 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:03:08.626081 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.729s {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:03:08.631887 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:03:08.648487 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:03:08.652716 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:03:08.653154 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.146s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:03:08.964637 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task 
ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:03:08.965252 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:03:08.965510 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:03:08.979145 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Didn't find any instances for network info cache update. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9928}} Aug 30 14:03:08.979424 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:04:05.027412 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:04:05.028288 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 
14:04:05.054610 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:04:05.055139 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:04:05.055574 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:04:05.057053 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:04:05.057547 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:04:05.834447 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder 
--conf /etc/ceph/ceph.conf" returned: 0 in 0.776s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:04:05.916655 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:04:05.918488 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1920MB free_disk=29.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": 
"1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:04:05.918854 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:04:05.919360 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:04:06.112892 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 0 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:04:06.113317 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=512MB phys_disk=29GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:04:06.273109 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json 
--id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:04:06.997869 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.724s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:04:07.004385 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:04:07.023150 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:04:07.029622 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:04:07.030501 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.111s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:04:08.029359 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:04:08.030526 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:04:08.030973 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:04:08.031288 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:04:08.044428 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Didn't find any instances for network info cache update. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9928}} Aug 30 14:04:08.045751 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:04:08.046370 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:04:08.046897 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:04:08.047430 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:04:08.047907 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:04:09.028179 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:05:05.027520 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:05:05.051623 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:05:05.052046 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:05:05.052367 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:05:05.052680 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None 
req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:05:05.053111 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:05:05.730008 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.676s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:05:05.838341 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Aug 30 14:05:05.839366 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1940MB free_disk=29.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:05:05.839770 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:05:05.840302 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:05:06.076208 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 0 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:05:06.076735 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=512MB phys_disk=29GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:05:06.264419 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:05:06.909368 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.645s {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:05:06.913949 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:05:06.926415 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:05:06.930182 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:05:06.930482 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.090s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:05:08.930388 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task 
ComputeManager._sync_scheduler_instance_info {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:05:08.930848 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:05:08.931083 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:05:08.931367 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:05:08.944959 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Didn't find any instances for network info cache update. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9928}} Aug 30 14:05:08.945511 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:05:08.946359 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:05:08.946726 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:05:08.947120 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:05:08.947464 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:05:09.027713 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:05:09.028224 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:05:09.028570 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:05:45.288671 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Acquiring lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:05:45.289178 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 
0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:05:45.316401 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Starting instance... {{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:05:45.675811 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:05:45.676140 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:05:45.681601 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:05:45.682102 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Claim successful on node np0035104604 Aug 30 14:05:46.233298 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:05:47.155562 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.922s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:05:47.163083 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:05:47.178528 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Inventory has not 
changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:05:47.431292 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.755s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:05:47.432384 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:05:47.517778 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:05:47.525993 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:05:47.637562 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:05:47.773717 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Start building block device mappings for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:05:48.031576 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:05:48.032982 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:05:48.033595 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Creating image(s) Aug 30 14:05:48.069391 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] rbd image a9a74ed7-6eb9-4167-86a3-fe188c08af97_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:05:48.120148 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] rbd image a9a74ed7-6eb9-4167-86a3-fe188c08af97_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:05:48.145543 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] rbd image a9a74ed7-6eb9-4167-86a3-fe188c08af97_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:05:48.148671 np0035104604 nova-compute[107505]: 
DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:05:48.150107 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:05:48.578915 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.imagebackend [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Image locations are: [{'url': 'rbd://b94dc20f-5638-48f6-87b0-a600a745ee27/images/a32d892d-d3ba-4b7e-9bab-7e06f730b9e0/snap', 'metadata': {}}] {{(pid=107505) clone /opt/stack/nova/nova/virt/libvirt/imagebackend.py:1070}} Aug 30 14:05:48.730450 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '113a433e1bcc438ea91d3b1b0508f251', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '52df4bef677b4f86bc2cd5ee880b7cc1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 
'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:05:49.671116 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e.part --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:05:49.755127 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e.part --force-share --output=json" returned: 0 in 0.083s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:05:49.756139 np0035104604 nova-compute[107505]: DEBUG nova.virt.images [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] a32d892d-d3ba-4b7e-9bab-7e06f730b9e0 was qcow2, converting to raw {{(pid=107505) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Aug 30 14:05:49.757516 np0035104604 nova-compute[107505]: DEBUG nova.privsep.utils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 
tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=107505) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Aug 30 14:05:49.758185 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e.part /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e.converted {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:05:49.890252 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e.part /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e.converted" returned: 0 in 0.131s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:05:49.897599 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e.converted --force-share --output=json {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:05:50.081398 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e.converted --force-share --output=json" returned: 0 in 0.184s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:05:50.082397 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.932s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:05:50.112041 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] rbd image a9a74ed7-6eb9-4167-86a3-fe188c08af97_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:05:50.115199 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 
a9a74ed7-6eb9-4167-86a3-fe188c08af97_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:05:50.154507 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Successfully created port: 246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:05:51.234052 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Successfully updated port: 246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:05:51.255511 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Acquiring lock "refresh_cache-a9a74ed7-6eb9-4167-86a3-fe188c08af97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:05:51.255831 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Acquired lock "refresh_cache-a9a74ed7-6eb9-4167-86a3-fe188c08af97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:05:51.256244 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron 
[None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:05:51.490115 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:05:51.717635 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-731ce065-f0c2-443e-b167-72930a23fc72 req-969c6ba8-fa99-4151-9ef8-96da8ec81d32 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Received event network-changed-246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:05:51.718046 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-731ce065-f0c2-443e-b167-72930a23fc72 req-969c6ba8-fa99-4151-9ef8-96da8ec81d32 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Refreshing instance network info cache due to event network-changed-246226ea-1f45-4c14-95cc-92ee432293be. 
{{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:05:51.718540 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-731ce065-f0c2-443e-b167-72930a23fc72 req-969c6ba8-fa99-4151-9ef8-96da8ec81d32 service nova] Acquiring lock "refresh_cache-a9a74ed7-6eb9-4167-86a3-fe188c08af97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:05:51.894152 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e a9a74ed7-6eb9-4167-86a3-fe188c08af97_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 1.779s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:05:52.199762 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] resizing rbd image a9a74ed7-6eb9-4167-86a3-fe188c08af97_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:05:52.433085 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:05:52.434324 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 
tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Ensure instance console log exists: /opt/stack/data/nova/instances/a9a74ed7-6eb9-4167-86a3-fe188c08af97/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:05:52.434922 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:05:52.435418 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:05:52.435862 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:05:52.630805 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Updating 
instance_info_cache with network_info: [{"id": "246226ea-1f45-4c14-95cc-92ee432293be", "address": "fa:16:3e:c3:43:e6", "network": {"id": "dc6199ed-e883-401b-91de-b1a4e5a0056d", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1905884620-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "52df4bef677b4f86bc2cd5ee880b7cc1", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap246226ea-1f", "ovs_interfaceid": "246226ea-1f45-4c14-95cc-92ee432293be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:05:52.646935 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Releasing lock "refresh_cache-a9a74ed7-6eb9-4167-86a3-fe188c08af97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:05:52.647540 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Instance network_info: |[{"id": "246226ea-1f45-4c14-95cc-92ee432293be", "address": "fa:16:3e:c3:43:e6", "network": {"id": 
"dc6199ed-e883-401b-91de-b1a4e5a0056d", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1905884620-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "52df4bef677b4f86bc2cd5ee880b7cc1", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap246226ea-1f", "ovs_interfaceid": "246226ea-1f45-4c14-95cc-92ee432293be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:05:52.648289 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-731ce065-f0c2-443e-b167-72930a23fc72 req-969c6ba8-fa99-4151-9ef8-96da8ec81d32 service nova] Acquired lock "refresh_cache-a9a74ed7-6eb9-4167-86a3-fe188c08af97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:05:52.648755 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-731ce065-f0c2-443e-b167-72930a23fc72 req-969c6ba8-fa99-4151-9ef8-96da8ec81d32 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Refreshing network info cache for port 246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:05:52.655380 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] 
[instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Start _get_guest_xml network_info=[{"id": "246226ea-1f45-4c14-95cc-92ee432293be", "address": "fa:16:3e:c3:43:e6", "network": {"id": "dc6199ed-e883-401b-91de-b1a4e5a0056d", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1905884620-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "52df4bef677b4f86bc2cd5ee880b7cc1", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap246226ea-1f", "ovs_interfaceid": "246226ea-1f45-4c14-95cc-92ee432293be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 
'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:05:52.661258 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:05:52.786795 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:05:52.787520 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:05:52.789754 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:05:52.792329 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] CPU controller found on host. {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:05:52.792329 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:05:52.792824 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:05:52.793220 
np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:05:52.793575 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:05:52.793966 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:05:52.794301 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:05:52.794647 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:05:52.795099 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 
tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:05:52.795528 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:05:52.796824 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:05:52.797306 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:05:52.797799 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:05:52.825591 np0035104604 nova-compute[107505]: DEBUG nova.privsep.utils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 
tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=107505) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Aug 30 14:05:52.826672 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:05:53.738217 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.911s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:05:53.798576 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] rbd image a9a74ed7-6eb9-4167-86a3-fe188c08af97_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:05:53.803735 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:05:54.306063 np0035104604 
nova-compute[107505]: DEBUG nova.network.neutron [req-731ce065-f0c2-443e-b167-72930a23fc72 req-969c6ba8-fa99-4151-9ef8-96da8ec81d32 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Updated VIF entry in instance network info cache for port 246226ea-1f45-4c14-95cc-92ee432293be. {{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:05:54.307053 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-731ce065-f0c2-443e-b167-72930a23fc72 req-969c6ba8-fa99-4151-9ef8-96da8ec81d32 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Updating instance_info_cache with network_info: [{"id": "246226ea-1f45-4c14-95cc-92ee432293be", "address": "fa:16:3e:c3:43:e6", "network": {"id": "dc6199ed-e883-401b-91de-b1a4e5a0056d", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1905884620-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "52df4bef677b4f86bc2cd5ee880b7cc1", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap246226ea-1f", "ovs_interfaceid": "246226ea-1f45-4c14-95cc-92ee432293be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:05:54.323808 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-731ce065-f0c2-443e-b167-72930a23fc72 req-969c6ba8-fa99-4151-9ef8-96da8ec81d32 service nova] Releasing lock 
"refresh_cache-a9a74ed7-6eb9-4167-86a3-fe188c08af97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:05:54.556268 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.752s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:05:54.560498 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:05:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAssistedSnapshotsTest-server-1478890135',display_name='tempest-VolumesAssistedSnapshotsTest-server-1478890135',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-volumesassistedsnapshotstest-server-1478890135',id=1,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFqC8uh2+YoNrXptc5ixgT/phgGQy0l9+/1kPf9/2AAS72msw7v6bohpg0KzDaAlfBGjXfBxccFhilsQDIM2reOBWA+UPM9yHEzoOy5G3dUW1uITle7QYHXulfuGIetqcQ==',key_name='tempest-keypair-649641997',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='52df4bef677b4f86bc2cd5ee880b7cc1',ramdisk_id='',reservation_id='r-ooxvtgat',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAssistedSnapshotsTest-74693406',owner_user_name='tempest-VolumesAssistedSnapshotsTest-74693406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:05:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='113a433e1bcc438ea91d3b1b0508f251',uuid=a9a74ed7-6eb9-4167-86a3-fe188c08af97,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "246226ea-1f45-4c14-95cc-92ee432293be", "address": "fa:16:3e:c3:43:e6", "network": {"id": "dc6199ed-e883-401b-91de-b1a4e5a0056d", "bridge": "br-int", "label": 
"tempest-VolumesAssistedSnapshotsTest-1905884620-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "52df4bef677b4f86bc2cd5ee880b7cc1", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap246226ea-1f", "ovs_interfaceid": "246226ea-1f45-4c14-95cc-92ee432293be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:05:54.561872 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Converting VIF {"id": "246226ea-1f45-4c14-95cc-92ee432293be", "address": "fa:16:3e:c3:43:e6", "network": {"id": "dc6199ed-e883-401b-91de-b1a4e5a0056d", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1905884620-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "52df4bef677b4f86bc2cd5ee880b7cc1", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap246226ea-1f", "ovs_interfaceid": 
"246226ea-1f45-4c14-95cc-92ee432293be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:05:54.566028 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:43:e6,bridge_name='br-int',has_traffic_filtering=True,id=246226ea-1f45-4c14-95cc-92ee432293be,network=Network(dc6199ed-e883-401b-91de-b1a4e5a0056d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap246226ea-1f') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:05:54.569981 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lazy-loading 'pci_devices' on Instance uuid a9a74ed7-6eb9-4167-86a3-fe188c08af97 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:05:54.591591 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] End _get_guest_xml xml=
[guest domain XML not recoverable: the XML markup was stripped during log extraction, leaving only element text interleaved with repeated "Aug 30 14:05:54.591591 np0035104604 nova-compute[107505]:" continuation prefixes. Recoverable field values: uuid a9a74ed7-6eb9-4167-86a3-fe188c08af97, name instance-00000001, memory 131072 KiB, 1 vCPU; Nova metadata: display name tempest-VolumesAssistedSnapshotsTest-server-1478890135, created 2023-08-30 14:05:52, 128 MB RAM, 1 vCPU, owner user tempest-VolumesAssistedSnapshotsTest-74693406-project-member, owner project tempest-VolumesAssistedSnapshotsTest-74693406, package version 27.1.0; sysinfo: OpenStack Foundation / OpenStack Nova 27.1.0 / Virtual Machine; os type hvm; CPU model Nehalem; RNG backend /dev/urandom]
{{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:05:54.602714 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Preparing to wait for
external event network-vif-plugged-246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:05:54.602714 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Acquiring lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:05:54.602714 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:05:54.602714 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:05:54.602714 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:05:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAssistedSnapshotsTest-server-1478890135',display_name='tempest-VolumesAssistedSnapshotsTest-server-1478890135',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-volumesassistedsnapshotstest-server-1478890135',id=1,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFqC8uh2+YoNrXptc5ixgT/phgGQy0l9+/1kPf9/2AAS72msw7v6bohpg0KzDaAlfBGjXfBxccFhilsQDIM2reOBWA+UPM9yHEzoOy5G3dUW1uITle7QYHXulfuGIetqcQ==',key_name='tempest-keypair-649641997',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='52df4bef677b4f86bc2cd5ee880b7cc1',ramdisk_id='',reservation_id='r-ooxvtgat',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-d
isk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAssistedSnapshotsTest-74693406',owner_user_name='tempest-VolumesAssistedSnapshotsTest-74693406-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:05:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='113a433e1bcc438ea91d3b1b0508f251',uuid=a9a74ed7-6eb9-4167-86a3-fe188c08af97,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "246226ea-1f45-4c14-95cc-92ee432293be", "address": "fa:16:3e:c3:43:e6", "network": {"id": "dc6199ed-e883-401b-91de-b1a4e5a0056d", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1905884620-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "52df4bef677b4f86bc2cd5ee880b7cc1", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap246226ea-1f", "ovs_interfaceid": "246226ea-1f45-4c14-95cc-92ee432293be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:05:54.604117 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Converting VIF {"id": "246226ea-1f45-4c14-95cc-92ee432293be", 
"address": "fa:16:3e:c3:43:e6", "network": {"id": "dc6199ed-e883-401b-91de-b1a4e5a0056d", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1905884620-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "52df4bef677b4f86bc2cd5ee880b7cc1", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap246226ea-1f", "ovs_interfaceid": "246226ea-1f45-4c14-95cc-92ee432293be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:05:54.604117 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:43:e6,bridge_name='br-int',has_traffic_filtering=True,id=246226ea-1f45-4c14-95cc-92ee432293be,network=Network(dc6199ed-e883-401b-91de-b1a4e5a0056d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap246226ea-1f') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:05:54.604117 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:c3:43:e6,bridge_name='br-int',has_traffic_filtering=True,id=246226ea-1f45-4c14-95cc-92ee432293be,network=Network(dc6199ed-e883-401b-91de-b1a4e5a0056d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap246226ea-1f') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:05:54.698943 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Created schema index Interface.name {{(pid=107505) autocreate_indices /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Aug 30 14:05:54.699366 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Created schema index Port.name {{(pid=107505) autocreate_indices /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Aug 30 14:05:54.699720 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Created schema index Bridge.name {{(pid=107505) autocreate_indices /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Aug 30 14:05:54.700441 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] tcp:127.0.0.1:6640: entering CONNECTING {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 
30 14:05:54.701213 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [POLLOUT] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:05:54.701501 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:05:54.702282 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:05:54.704537 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:05:54.746618 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:05:54.762806 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:05:54.763273 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:05:54.763720 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:05:54.770317 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpn5shpt8g/privsep.sock'] Aug 30 14:05:54.784946 np0035104604 sudo[118207]: stack : PWD=/ ; USER=root ; COMMAND=/opt/stack/data/venv/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmpn5shpt8g/privsep.sock Aug 30 14:05:54.785750 np0035104604 sudo[118207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Aug 30 14:05:56.505560 np0035104604 sudo[118207]: pam_unix(sudo:session): session closed for user root Aug 30 14:05:56.516483 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Spawned new privsep daemon via rootwrap Aug 30 14:05:56.517856 np0035104604 
nova-compute[107505]: INFO oslo.privsep.daemon [-] privsep daemon starting Aug 30 14:05:56.519648 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Aug 30 14:05:56.520111 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none Aug 30 14:05:56.520440 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 118224 Aug 30 14:05:57.191043 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:05:57.191715 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap246226ea-1f, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:05:57.192764 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap246226ea-1f, col_values=(('external_ids', {'iface-id': '246226ea-1f45-4c14-95cc-92ee432293be', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c3:43:e6', 'vm-uuid': 'a9a74ed7-6eb9-4167-86a3-fe188c08af97'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:05:57.194821 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:05:57.200188 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) 
__log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:05:57.204260 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:05:57.207085 np0035104604 nova-compute[107505]: INFO os_vif [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:43:e6,bridge_name='br-int',has_traffic_filtering=True,id=246226ea-1f45-4c14-95cc-92ee432293be,network=Network(dc6199ed-e883-401b-91de-b1a4e5a0056d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap246226ea-1f') Aug 30 14:05:57.259375 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:05:57.260081 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:05:57.260548 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] No VIF found with MAC fa:16:3e:c3:43:e6, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:05:57.261597 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Using config drive Aug 30 14:05:57.304899 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] rbd image a9a74ed7-6eb9-4167-86a3-fe188c08af97_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:05:57.836068 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Creating config drive at /opt/stack/data/nova/instances/a9a74ed7-6eb9-4167-86a3-fe188c08af97/disk.config Aug 30 14:05:57.841602 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/a9a74ed7-6eb9-4167-86a3-fe188c08af97/disk.config -ldots -allow-lowercase -allow-multidot -l 
-publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmph48xqrxw {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:05:57.870053 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/a9a74ed7-6eb9-4167-86a3-fe188c08af97/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmph48xqrxw" returned: 0 in 0.028s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:05:57.897664 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] rbd image a9a74ed7-6eb9-4167-86a3-fe188c08af97_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:05:57.901890 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/a9a74ed7-6eb9-4167-86a3-fe188c08af97/disk.config a9a74ed7-6eb9-4167-86a3-fe188c08af97_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:05:58.049639 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 
tempest-VolumesAssistedSnapshotsTest-74693406-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/a9a74ed7-6eb9-4167-86a3-fe188c08af97/disk.config a9a74ed7-6eb9-4167-86a3-fe188c08af97_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.147s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:05:58.050365 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Deleting local config drive /opt/stack/data/nova/instances/a9a74ed7-6eb9-4167-86a3-fe188c08af97/disk.config because it was imported into RBD. Aug 30 14:05:58.089813 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:05:58.099547 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:05:58.624234 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:05:58.836897 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-9d5eedcb-dabf-4807-a55c-8a95a505d4b1 req-3d59461f-38aa-419b-944d-df8a965484e9 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Received event network-vif-plugged-246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:05:58.837546 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils 
[req-9d5eedcb-dabf-4807-a55c-8a95a505d4b1 req-3d59461f-38aa-419b-944d-df8a965484e9 service nova] Acquiring lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:05:58.838248 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-9d5eedcb-dabf-4807-a55c-8a95a505d4b1 req-3d59461f-38aa-419b-944d-df8a965484e9 service nova] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:05:58.838248 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-9d5eedcb-dabf-4807-a55c-8a95a505d4b1 req-3d59461f-38aa-419b-944d-df8a965484e9 service nova] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:05:58.838616 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-9d5eedcb-dabf-4807-a55c-8a95a505d4b1 req-3d59461f-38aa-419b-944d-df8a965484e9 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Processing event network-vif-plugged-246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:05:59.100855 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:05:59.102004 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] 
[instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] VM Started (Lifecycle Event) Aug 30 14:05:59.105116 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:05:59.108822 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:05:59.111838 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Instance spawned successfully. 
Aug 30 14:05:59.112436 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:05:59.141265 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:05:59.145329 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:05:59.146114 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:05:59.146824 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None 
req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:05:59.147385 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:05:59.149564 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:05:59.198973 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:05:59.204313 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:05:59.256290 np0035104604 
nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:05:59.257001 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:05:59.258249 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] VM Paused (Lifecycle Event) Aug 30 14:05:59.270755 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Took 11.24 seconds to spawn the instance on the hypervisor. 
Aug 30 14:05:59.271915 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:05:59.280495 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:05:59.298139 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:05:59.298711 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] VM Resumed (Lifecycle Event) Aug 30 14:05:59.329121 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:05:59.337556 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:05:59.388599 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 
tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Took 13.93 seconds to build instance. Aug 30 14:05:59.427215 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-85c398e9-56e8-44d5-a119-c2de0a25b9c6 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.138s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:00.911949 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-6b8b1e03-ab87-4947-afaf-7a091fa8eb91 req-853996ad-d0b6-4b7d-a17c-9cdb43dd2c83 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Received event network-vif-plugged-246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:00.912661 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-6b8b1e03-ab87-4947-afaf-7a091fa8eb91 req-853996ad-d0b6-4b7d-a17c-9cdb43dd2c83 service nova] Acquiring lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:00.914380 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-6b8b1e03-ab87-4947-afaf-7a091fa8eb91 req-853996ad-d0b6-4b7d-a17c-9cdb43dd2c83 service nova] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 
14:06:00.914740 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-6b8b1e03-ab87-4947-afaf-7a091fa8eb91 req-853996ad-d0b6-4b7d-a17c-9cdb43dd2c83 service nova] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:00.915394 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-6b8b1e03-ab87-4947-afaf-7a091fa8eb91 req-853996ad-d0b6-4b7d-a17c-9cdb43dd2c83 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] No waiting events found dispatching network-vif-plugged-246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:06:00.916133 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-6b8b1e03-ab87-4947-afaf-7a091fa8eb91 req-853996ad-d0b6-4b7d-a17c-9cdb43dd2c83 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Received unexpected event network-vif-plugged-246226ea-1f45-4c14-95cc-92ee432293be for instance with vm_state active and task_state None. Aug 30 14:06:01.879414 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-3f0dc3d6-be7a-4eb7-8542-357790eb2e2e req-79b771cc-c91e-4ef3-85d6-11e7a9237af4 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Received event network-changed-246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:01.879843 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-3f0dc3d6-be7a-4eb7-8542-357790eb2e2e req-79b771cc-c91e-4ef3-85d6-11e7a9237af4 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Refreshing instance network info cache due to event network-changed-246226ea-1f45-4c14-95cc-92ee432293be. 
{{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:06:01.880365 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-3f0dc3d6-be7a-4eb7-8542-357790eb2e2e req-79b771cc-c91e-4ef3-85d6-11e7a9237af4 service nova] Acquiring lock "refresh_cache-a9a74ed7-6eb9-4167-86a3-fe188c08af97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:06:01.880776 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-3f0dc3d6-be7a-4eb7-8542-357790eb2e2e req-79b771cc-c91e-4ef3-85d6-11e7a9237af4 service nova] Acquired lock "refresh_cache-a9a74ed7-6eb9-4167-86a3-fe188c08af97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:06:01.881337 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-3f0dc3d6-be7a-4eb7-8542-357790eb2e2e req-79b771cc-c91e-4ef3-85d6-11e7a9237af4 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Refreshing network info cache for port 246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:06:02.253753 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:03.338050 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:03.343279 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:03.355744 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-3f0dc3d6-be7a-4eb7-8542-357790eb2e2e 
req-79b771cc-c91e-4ef3-85d6-11e7a9237af4 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Updated VIF entry in instance network info cache for port 246226ea-1f45-4c14-95cc-92ee432293be. {{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:06:03.355847 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-3f0dc3d6-be7a-4eb7-8542-357790eb2e2e req-79b771cc-c91e-4ef3-85d6-11e7a9237af4 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Updating instance_info_cache with network_info: [{"id": "246226ea-1f45-4c14-95cc-92ee432293be", "address": "fa:16:3e:c3:43:e6", "network": {"id": "dc6199ed-e883-401b-91de-b1a4e5a0056d", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1905884620-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "52df4bef677b4f86bc2cd5ee880b7cc1", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap246226ea-1f", "ovs_interfaceid": "246226ea-1f45-4c14-95cc-92ee432293be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:06:03.358734 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:03.372818 np0035104604 
nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-3f0dc3d6-be7a-4eb7-8542-357790eb2e2e req-79b771cc-c91e-4ef3-85d6-11e7a9237af4 service nova] Releasing lock "refresh_cache-a9a74ed7-6eb9-4167-86a3-fe188c08af97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:06:03.626222 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:05.026241 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:06:05.053715 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:05.054716 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:05.056902 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:05.057867 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:06:05.058299 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:05.932601 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.874s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:06.007197 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000001 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:06:06.007409 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000001 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:06:06.083220 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Aug 30 14:06:06.086665 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1267MB free_disk=29.975162506103516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 
14:06:06.087082 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:06.087608 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:06.280585 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance a9a74ed7-6eb9-4167-86a3-fe188c08af97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:06:06.280919 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 1 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:06:06.281223 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=640MB phys_disk=29GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:06:06.592591 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:07.258115 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:07.325485 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.732s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:07.333807 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:06:07.358056 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:06:07.405959 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:06:07.406454 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.319s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:07.407959 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:06:07.408645 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Cleaning up deleted instances {{(pid=107505) _run_pending_deletes 
/opt/stack/nova/nova/compute/manager.py:11129}} Aug 30 14:06:07.437888 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] There are 0 instances to clean {{(pid=107505) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11138}} Aug 30 14:06:07.438871 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:06:07.439469 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Cleaning up deleted instances with incomplete migration {{(pid=107505) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11167}} Aug 30 14:06:07.461664 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:06:08.630202 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:10.270731 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Acquiring lock "321b765f-5338-445c-a408-d5ee207af425" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 
30 14:06:10.271354 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lock "321b765f-5338-445c-a408-d5ee207af425" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:10.295831 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Starting instance... {{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:06:10.492265 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:06:10.492663 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:06:10.493035 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:06:10.493135 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the 
list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:06:10.628194 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:10.628194 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:10.636217 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:06:10.636217 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Claim successful on node np0035104604 Aug 30 14:06:10.740032 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "refresh_cache-a9a74ed7-6eb9-4167-86a3-fe188c08af97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:06:10.740395 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquired lock "refresh_cache-a9a74ed7-6eb9-4167-86a3-fe188c08af97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:06:10.740732 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Forcefully refreshing network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}} Aug 30 14:06:10.741056 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lazy-loading 'info_cache' on Instance uuid a9a74ed7-6eb9-4167-86a3-fe188c08af97 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:06:11.406270 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf 
/etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:12.232006 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.825s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:12.243316 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:06:12.261226 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:06:12.271747 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 
30 14:06:12.421833 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.794s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:12.422930 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:06:12.494523 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Not allocating networking since 'none' was specified. {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} Aug 30 14:06:12.514286 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Aug 30 14:06:12.648027 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Start building block device mappings for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:06:12.740434 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Updating instance_info_cache with network_info: [{"id": "246226ea-1f45-4c14-95cc-92ee432293be", "address": "fa:16:3e:c3:43:e6", "network": {"id": "dc6199ed-e883-401b-91de-b1a4e5a0056d", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1905884620-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "52df4bef677b4f86bc2cd5ee880b7cc1", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap246226ea-1f", "ovs_interfaceid": "246226ea-1f45-4c14-95cc-92ee432293be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:06:12.766952 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None 
req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Releasing lock "refresh_cache-a9a74ed7-6eb9-4167-86a3-fe188c08af97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:06:12.767482 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Updated the network info_cache for instance {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} Aug 30 14:06:12.768622 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:06:12.769015 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:06:12.769566 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:06:12.770271 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:06:12.771079 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None 
req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:06:12.771559 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:06:12.772104 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:06:12.802603 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:06:12.804036 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:06:12.804622 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Creating image(s) Aug 30 14:06:12.834948 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] rbd image 321b765f-5338-445c-a408-d5ee207af425_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:12.865218 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] rbd image 321b765f-5338-445c-a408-d5ee207af425_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:12.895502 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] rbd image 321b765f-5338-445c-a408-d5ee207af425_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:12.899703 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.processutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:12.989111 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.089s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:12.990377 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:12.991889 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by 
"nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.002s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:12.992805 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:13.036836 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] rbd image 321b765f-5338-445c-a408-d5ee207af425_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:13.040706 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 321b765f-5338-445c-a408-d5ee207af425_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:13.392049 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 
321b765f-5338-445c-a408-d5ee207af425_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.350s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:13.475762 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] resizing rbd image 321b765f-5338-445c-a408-d5ee207af425_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:06:13.573531 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:06:13.574504 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Ensure instance console log exists: /opt/stack/data/nova/instances/321b765f-5338-445c-a408-d5ee207af425/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:06:13.575260 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:13.576142 np0035104604 
nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:13.576761 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:13.579797 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 
'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:06:13.584577 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:06:13.587240 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:06:13.587794 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:06:13.589374 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:06:13.589895 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] CPU controller found on host. {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:06:13.591333 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:06:13.592368 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:06:13.592368 np0035104604 
nova-compute[107505]: DEBUG nova.virt.hardware [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:06:13.592710 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:06:13.592907 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:06:13.593208 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:06:13.593475 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:06:13.593828 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Topology preferred 
VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:06:13.594166 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:06:13.594441 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:06:13.594717 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:06:13.595041 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:06:13.622452 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Running cmd (subprocess): ceph mon dump --format=json --id 
cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:13.629888 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:14.232767 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.622s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:14.267824 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] rbd image 321b765f-5338-445c-a408-d5ee207af425_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:14.271109 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:14.848768 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.577s {{(pid=107505) 
execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:14.850831 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lazy-loading 'pci_devices' on Instance uuid 321b765f-5338-445c-a408-d5ee207af425 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:06:14.867817 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] End _get_guest_xml xml=
[guest domain XML garbled in extraction — markup stripped; recoverable values: uuid 321b765f-5338-445c-a408-d5ee207af425, name instance-00000002, memory 131072, vcpus 1, nova metadata: name tempest-ServerDiagnosticsV248Test-server-550160329, creationTime 2023-08-30 14:06:13, flavor memory_mb 128 / vcpus 1 / ephemeral 0 / swap 0 / root 1, owner user tempest-ServerDiagnosticsV248Test-334985725-project-member, project tempest-ServerDiagnosticsV248Test-334985725, sysinfo: manufacturer OpenStack Foundation, product OpenStack Nova, version 27.1.0, serial/uuid 321b765f-5338-445c-a408-d5ee207af425, family Virtual Machine, os type hvm, cpu model Nehalem, rng backend /dev/urandom]
{{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:06:14.913371 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:06:14.914129 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] No BDM found with device name hda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:06:14.914766 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Using config drive Aug 30 14:06:14.942666 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] rbd image 321b765f-5338-445c-a408-d5ee207af425_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:15.132994 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Creating config drive at
/opt/stack/data/nova/instances/321b765f-5338-445c-a408-d5ee207af425/disk.config Aug 30 14:06:15.138672 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/321b765f-5338-445c-a408-d5ee207af425/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmp_rutlqh6 {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:15.174025 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/321b765f-5338-445c-a408-d5ee207af425/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmp_rutlqh6" returned: 0 in 0.035s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:15.206110 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] rbd image 321b765f-5338-445c-a408-d5ee207af425_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:15.209326 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Running cmd (subprocess): rbd import --pool vms 
/opt/stack/data/nova/instances/321b765f-5338-445c-a408-d5ee207af425/disk.config 321b765f-5338-445c-a408-d5ee207af425_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:15.350359 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/321b765f-5338-445c-a408-d5ee207af425/disk.config 321b765f-5338-445c-a408-d5ee207af425_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.140s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:15.351164 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Deleting local config drive /opt/stack/data/nova/instances/321b765f-5338-445c-a408-d5ee207af425/disk.config because it was imported into RBD. 
Aug 30 14:06:16.242868 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Instance event wait completed in 0 seconds for {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:06:16.243313 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:06:16.243859 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:06:16.244385 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 321b765f-5338-445c-a408-d5ee207af425] VM Resumed (Lifecycle Event) Aug 30 14:06:16.251285 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 321b765f-5338-445c-a408-d5ee207af425] Instance spawned successfully. 
Aug 30 14:06:16.251835 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:06:16.276740 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 321b765f-5338-445c-a408-d5ee207af425] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:16.282810 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:16.283278 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:16.284163 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] 
Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:16.284914 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:16.285851 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:16.286730 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:16.294843 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 321b765f-5338-445c-a408-d5ee207af425] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:06:16.319917 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None 
req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 321b765f-5338-445c-a408-d5ee207af425] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:06:16.320263 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:06:16.320489 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 321b765f-5338-445c-a408-d5ee207af425] VM Started (Lifecycle Event) Aug 30 14:06:16.344415 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 321b765f-5338-445c-a408-d5ee207af425] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:16.348624 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 321b765f-5338-445c-a408-d5ee207af425] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:06:16.370062 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 321b765f-5338-445c-a408-d5ee207af425] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:06:16.389803 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Took 3.59 seconds to spawn the instance on the hypervisor. 
Aug 30 14:06:16.390158 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:16.469416 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Took 6.09 seconds to build instance. Aug 30 14:06:16.489229 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a72a514-0308-4a10-b517-0ef3d627c1ed tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lock "321b765f-5338-445c-a408-d5ee207af425" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.218s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:17.284881 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:17.411586 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-4c8aad12-f7cc-48f0-b46b-8d0a6caebe6d tempest-ServerDiagnosticsV248Test-1735640900 tempest-ServerDiagnosticsV248Test-1735640900-project-admin] [instance: 321b765f-5338-445c-a408-d5ee207af425] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:17.415679 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-4c8aad12-f7cc-48f0-b46b-8d0a6caebe6d tempest-ServerDiagnosticsV248Test-1735640900 
tempest-ServerDiagnosticsV248Test-1735640900-project-admin] [instance: 321b765f-5338-445c-a408-d5ee207af425] Retrieving diagnostics Aug 30 14:06:18.366279 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:18.631627 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:22.287025 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:24.313263 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Acquiring lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:24.316679 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.004s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:24.339764 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 
tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Starting instance... {{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:06:24.778302 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:24.778876 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:24.784111 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:06:24.784470 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Claim successful on node np0035104604 Aug 30 14:06:25.061655 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Refreshing inventories for resource provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Aug 30 14:06:25.226890 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Updating ProviderTree inventory for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Aug 30 14:06:25.227344 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Updating inventory in ProviderTree for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 
'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Aug 30 14:06:25.361345 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Refreshing aggregate associations for resource provider 600ab55f-530c-4be6-bf02-067d68ce7ee4, aggregates: None {{(pid=107505) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Aug 30 14:06:25.558292 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Refreshing trait associations for resource provider 600ab55f-530c-4be6-bf02-067d68ce7ee4, traits: 
COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_SMMUV3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI {{(pid=107505) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Aug 30 14:06:26.199305 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:26.916194 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] CMD "ceph df 
--format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.717s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:26.922147 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:06:26.938099 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:06:26.974973 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.196s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:26.975846 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 
tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:06:27.054227 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Allocating IP information in the background. {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:06:27.055439 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:06:27.188713 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:06:27.362545 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:27.364131 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Start building block device mappings for instance. 
{{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:06:27.499339 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '70b7977558f3470daaf99abc324238f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '596b61c1a20b447d949fa872fbcb0082', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:06:27.657766 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:06:27.659006 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:06:27.660887 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Creating image(s) Aug 30 14:06:27.732803 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] rbd image ad833d26-ac53-4df9-aaaf-daa2928d67f3_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:27.794532 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] rbd image ad833d26-ac53-4df9-aaaf-daa2928d67f3_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:27.842045 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] rbd image ad833d26-ac53-4df9-aaaf-daa2928d67f3_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:27.847886 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:27.871535 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-ee83323a-67c2-45f3-a719-07e7f297345a tempest-ServerDiagnosticsV248Test-1735640900 tempest-ServerDiagnosticsV248Test-1735640900-project-admin] [instance: 321b765f-5338-445c-a408-d5ee207af425] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:27.879677 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-ee83323a-67c2-45f3-a719-07e7f297345a tempest-ServerDiagnosticsV248Test-1735640900 tempest-ServerDiagnosticsV248Test-1735640900-project-admin] [instance: 321b765f-5338-445c-a408-d5ee207af425] Retrieving diagnostics Aug 30 14:06:27.961552 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.114s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:27.962579 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 
tempest-ServerDiagnosticsTest-1789236084-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:27.963401 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:27.964007 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:28.011927 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] rbd image ad833d26-ac53-4df9-aaaf-daa2928d67f3_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:28.025830 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 
ad833d26-ac53-4df9-aaaf-daa2928d67f3_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:28.185141 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Acquiring lock "321b765f-5338-445c-a408-d5ee207af425" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:28.185581 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lock "321b765f-5338-445c-a408-d5ee207af425" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:28.185960 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Acquiring lock "321b765f-5338-445c-a408-d5ee207af425-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:28.186281 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lock "321b765f-5338-445c-a408-d5ee207af425-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:28.186555 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lock "321b765f-5338-445c-a408-d5ee207af425-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:28.192757 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Terminating instance Aug 30 14:06:28.194409 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Acquiring lock "refresh_cache-321b765f-5338-445c-a408-d5ee207af425" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:06:28.194409 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Acquired lock "refresh_cache-321b765f-5338-445c-a408-d5ee207af425" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:06:28.194409 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce 
tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:06:28.459861 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:06:28.649822 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Successfully created port: b2c6e252-54fa-4458-8f57-5d9dda91f621 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:06:28.797502 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e ad833d26-ac53-4df9-aaaf-daa2928d67f3_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.777s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:29.197856 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] resizing rbd image ad833d26-ac53-4df9-aaaf-daa2928d67f3_disk to 1073741824 
{{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:06:29.390642 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:06:29.420986 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:06:29.420986 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Ensure instance console log exists: /opt/stack/data/nova/instances/ad833d26-ac53-4df9-aaaf-daa2928d67f3/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:06:29.422325 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:29.422325 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 
tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:29.422664 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:29.424508 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Releasing lock "refresh_cache-321b765f-5338-445c-a408-d5ee207af425" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:06:29.425716 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Start destroying the instance on the hypervisor. {{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:06:29.666242 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 321b765f-5338-445c-a408-d5ee207af425] Instance destroyed successfully. 
Aug 30 14:06:29.667122 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lazy-loading 'resources' on Instance uuid 321b765f-5338-445c-a408-d5ee207af425 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:06:29.948146 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Successfully updated port: b2c6e252-54fa-4458-8f57-5d9dda91f621 {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:06:29.966083 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Acquiring lock "refresh_cache-ad833d26-ac53-4df9-aaaf-daa2928d67f3" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:06:29.966652 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Acquired lock "refresh_cache-ad833d26-ac53-4df9-aaaf-daa2928d67f3" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:06:29.967071 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Building network info cache for instance {{(pid=107505) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:06:30.137963 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-43523443-a99d-49ba-82db-b078e05b2a53 req-5f231d9f-de00-4750-ae86-0c3d26eed890 service nova] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Received event network-changed-b2c6e252-54fa-4458-8f57-5d9dda91f621 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:30.138472 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-43523443-a99d-49ba-82db-b078e05b2a53 req-5f231d9f-de00-4750-ae86-0c3d26eed890 service nova] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Refreshing instance network info cache due to event network-changed-b2c6e252-54fa-4458-8f57-5d9dda91f621. {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:06:30.138895 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-43523443-a99d-49ba-82db-b078e05b2a53 req-5f231d9f-de00-4750-ae86-0c3d26eed890 service nova] Acquiring lock "refresh_cache-ad833d26-ac53-4df9-aaaf-daa2928d67f3" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:06:30.172214 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Instance cache missing network info. 
{{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:06:30.331833 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Deleting instance files /opt/stack/data/nova/instances/321b765f-5338-445c-a408-d5ee207af425_del Aug 30 14:06:30.332452 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Deletion of /opt/stack/data/nova/instances/321b765f-5338-445c-a408-d5ee207af425_del complete Aug 30 14:06:30.545433 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Checking UEFI support for host arch (x86_64) {{(pid=107505) supports_uefi /opt/stack/nova/nova/virt/libvirt/host.py:1751}} Aug 30 14:06:30.545956 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.host [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] UEFI support detected Aug 30 14:06:30.549969 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] [instance: 321b765f-5338-445c-a408-d5ee207af425] Took 1.12 seconds to destroy the instance on the hypervisor. 
Aug 30 14:06:30.550767 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:06:30.551074 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: 321b765f-5338-445c-a408-d5ee207af425] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:06:30.551314 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 321b765f-5338-445c-a408-d5ee207af425] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:06:30.986460 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 321b765f-5338-445c-a408-d5ee207af425] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:06:31.000412 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 321b765f-5338-445c-a408-d5ee207af425] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:06:31.018649 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 321b765f-5338-445c-a408-d5ee207af425] Took 0.47 seconds to deallocate network for instance. 
Aug 30 14:06:31.209295 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:31.209579 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:31.364703 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Updating instance_info_cache with network_info: [{"id": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "address": "fa:16:3e:26:3d:98", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.181", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2c6e252-54", 
"ovs_interfaceid": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:06:31.386781 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Releasing lock "refresh_cache-ad833d26-ac53-4df9-aaaf-daa2928d67f3" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:06:31.387498 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Instance network_info: |[{"id": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "address": "fa:16:3e:26:3d:98", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.181", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2c6e252-54", "ovs_interfaceid": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:06:31.388697 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-43523443-a99d-49ba-82db-b078e05b2a53 req-5f231d9f-de00-4750-ae86-0c3d26eed890 service nova] Acquired lock "refresh_cache-ad833d26-ac53-4df9-aaaf-daa2928d67f3" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:06:31.389785 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-43523443-a99d-49ba-82db-b078e05b2a53 req-5f231d9f-de00-4750-ae86-0c3d26eed890 service nova] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Refreshing network info cache for port b2c6e252-54fa-4458-8f57-5d9dda91f621 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:06:31.393299 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Start _get_guest_xml network_info=[{"id": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "address": "fa:16:3e:26:3d:98", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.181", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2c6e252-54", "ovs_interfaceid": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "qbh_params": 
null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:06:31.399905 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:06:31.403904 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... 
{{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:06:31.404921 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:06:31.661030 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:06:31.661393 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:06:31.663862 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:06:31.664896 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:06:31.665103 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:06:31.665308 np0035104604 nova-compute[107505]: DEBUG 
nova.virt.hardware [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:06:31.672694 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:06:31.672694 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:06:31.672694 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:06:31.672694 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:06:31.672694 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 
tempest-ServerDiagnosticsTest-1789236084-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:06:31.672694 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:06:31.672694 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:06:31.672694 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:06:31.687242 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:32.261402 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Running cmd 
(subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:32.492897 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:32.556913 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.870s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:32.743246 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] rbd image ad833d26-ac53-4df9-aaaf-daa2928d67f3_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:32.757513 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:33.033238 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 
0.771s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:33.209804 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:06:33.229789 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:06:33.266159 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.056s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:33.401808 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-43523443-a99d-49ba-82db-b078e05b2a53 req-5f231d9f-de00-4750-ae86-0c3d26eed890 service nova] [instance: 
ad833d26-ac53-4df9-aaaf-daa2928d67f3] Updated VIF entry in instance network info cache for port b2c6e252-54fa-4458-8f57-5d9dda91f621. {{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:06:33.402473 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-43523443-a99d-49ba-82db-b078e05b2a53 req-5f231d9f-de00-4750-ae86-0c3d26eed890 service nova] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Updating instance_info_cache with network_info: [{"id": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "address": "fa:16:3e:26:3d:98", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.181", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2c6e252-54", "ovs_interfaceid": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:06:33.419220 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-43523443-a99d-49ba-82db-b078e05b2a53 req-5f231d9f-de00-4750-ae86-0c3d26eed890 service nova] Releasing lock "refresh_cache-ad833d26-ac53-4df9-aaaf-daa2928d67f3" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:06:33.457193 np0035104604 nova-compute[107505]: 
INFO nova.scheduler.client.report [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Deleted allocations for instance 321b765f-5338-445c-a408-d5ee207af425 Aug 30 14:06:33.572952 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.816s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:33.575622 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:06:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiagnosticsTest-server-58398265',display_name='tempest-ServerDiagnosticsTest-server-58398265',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serverdiagnosticstest-server-58398265',id=3,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=Instan
cePCIRequests,power_state=0,progress=0,project_id='596b61c1a20b447d949fa872fbcb0082',ramdisk_id='',reservation_id='r-6zwo3d8s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerDiagnosticsTest-1789236084',owner_user_name='tempest-ServerDiagnosticsTest-1789236084-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:06:27Z,user_data=None,user_id='70b7977558f3470daaf99abc324238f4',uuid=ad833d26-ac53-4df9-aaaf-daa2928d67f3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "address": "fa:16:3e:26:3d:98", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.181", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2c6e252-54", "ovs_interfaceid": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:06:33.576120 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Converting VIF {"id": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "address": "fa:16:3e:26:3d:98", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.181", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2c6e252-54", "ovs_interfaceid": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:06:33.577184 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:26:3d:98,bridge_name='br-int',has_traffic_filtering=True,id=b2c6e252-54fa-4458-8f57-5d9dda91f621,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2c6e252-54') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:06:33.578523 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lazy-loading 'pci_devices' on Instance uuid ad833d26-ac53-4df9-aaaf-daa2928d67f3 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:06:33.584486 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e7501cc4-e619-4bba-8baf-32511e70f8ce tempest-ServerDiagnosticsV248Test-334985725 tempest-ServerDiagnosticsV248Test-334985725-project-member] Lock "321b765f-5338-445c-a408-d5ee207af425" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 5.399s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:33.594313 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] End _get_guest_xml xml= [libvirt guest domain XML not preserved in this log rendering (markup stripped); recoverable values from the surviving fragments: uuid ad833d26-ac53-4df9-aaaf-daa2928d67f3, domain name instance-00000003, title tempest-ServerDiagnosticsTest-server-58398265, created 2023-08-30 14:06:31, memory 131072 KiB, 1 vCPU, flavor memory 128 / vcpus 1 / swap 0 / ephemeral 0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom, system info OpenStack Foundation / OpenStack Nova / 27.1.0, owner tempest-ServerDiagnosticsTest-1789236084-project-member / tempest-ServerDiagnosticsTest-1789236084] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:06:33.603508 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Preparing to wait for external event network-vif-plugged-b2c6e252-54fa-4458-8f57-5d9dda91f621 {{(pid=107505) prepare_for_instance_event
/opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:06:33.603508 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Acquiring lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:33.603508 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:33.603508 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:33.603508 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:06:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiagnosticsTest-server-58398265',display_name='tempest-ServerDiagnosticsTest-server-58398265',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serverdiagnosticstest-server-58398265',id=3,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='596b61c1a20b447d949fa872fbcb0082',ramdisk_id='',reservation_id='r-6zwo3d8s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerDiagnosticsTest-1789236084',owner_user_name='tempest-ServerDiagnosticsTest-1789236084-project-member'},tags=TagList,task_state='spawning',terminated_at=None,
trusted_certs=None,updated_at=2023-08-30T14:06:27Z,user_data=None,user_id='70b7977558f3470daaf99abc324238f4',uuid=ad833d26-ac53-4df9-aaaf-daa2928d67f3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "address": "fa:16:3e:26:3d:98", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.181", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2c6e252-54", "ovs_interfaceid": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:06:33.604779 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Converting VIF {"id": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "address": "fa:16:3e:26:3d:98", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.181", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2c6e252-54", "ovs_interfaceid": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:06:33.604779 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:3d:98,bridge_name='br-int',has_traffic_filtering=True,id=b2c6e252-54fa-4458-8f57-5d9dda91f621,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2c6e252-54') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:06:33.604779 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:3d:98,bridge_name='br-int',has_traffic_filtering=True,id=b2c6e252-54fa-4458-8f57-5d9dda91f621,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2c6e252-54') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:06:33.604779 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:33.604779 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:06:33.604779 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:06:33.610119 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:33.610119 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2c6e252-54, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:06:33.610119 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb2c6e252-54, col_values=(('external_ids', {'iface-id': 'b2c6e252-54fa-4458-8f57-5d9dda91f621', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:3d:98', 'vm-uuid': 'ad833d26-ac53-4df9-aaaf-daa2928d67f3'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:06:33.611679 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 
14:06:33.618037 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:06:33.620046 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:33.622041 np0035104604 nova-compute[107505]: INFO os_vif [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:3d:98,bridge_name='br-int',has_traffic_filtering=True,id=b2c6e252-54fa-4458-8f57-5d9dda91f621,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2c6e252-54') Aug 30 14:06:33.633643 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:33.667434 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:06:33.667745 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:06:33.668049 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] No VIF found with MAC fa:16:3e:26:3d:98, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:06:33.668879 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Using config drive Aug 30 14:06:33.699940 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] rbd image ad833d26-ac53-4df9-aaaf-daa2928d67f3_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:34.092754 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Creating config drive at /opt/stack/data/nova/instances/ad833d26-ac53-4df9-aaaf-daa2928d67f3/disk.config Aug 30 14:06:34.097005 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/ad833d26-ac53-4df9-aaaf-daa2928d67f3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V 
config-2 /tmp/tmpt9ay9jdv {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:34.123796 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/ad833d26-ac53-4df9-aaaf-daa2928d67f3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpt9ay9jdv" returned: 0 in 0.026s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:34.164204 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] rbd image ad833d26-ac53-4df9-aaaf-daa2928d67f3_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:34.174492 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/ad833d26-ac53-4df9-aaaf-daa2928d67f3/disk.config ad833d26-ac53-4df9-aaaf-daa2928d67f3_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:34.412790 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] CMD "rbd import --pool vms 
/opt/stack/data/nova/instances/ad833d26-ac53-4df9-aaaf-daa2928d67f3/disk.config ad833d26-ac53-4df9-aaaf-daa2928d67f3_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.238s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:34.413633 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Deleting local config drive /opt/stack/data/nova/instances/ad833d26-ac53-4df9-aaaf-daa2928d67f3/disk.config because it was imported into RBD. Aug 30 14:06:34.440465 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:34.492255 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:34.492255 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:34.732268 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-c33bac2b-084b-4499-b7ea-983ae9998aa3 req-41966b69-81ec-4ff8-823a-643bc2140d8e service nova] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Received event network-vif-plugged-b2c6e252-54fa-4458-8f57-5d9dda91f621 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:34.732987 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c33bac2b-084b-4499-b7ea-983ae9998aa3 req-41966b69-81ec-4ff8-823a-643bc2140d8e service nova] Acquiring lock 
"ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:34.733816 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c33bac2b-084b-4499-b7ea-983ae9998aa3 req-41966b69-81ec-4ff8-823a-643bc2140d8e service nova] Lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:34.734117 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c33bac2b-084b-4499-b7ea-983ae9998aa3 req-41966b69-81ec-4ff8-823a-643bc2140d8e service nova] Lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:34.734526 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-c33bac2b-084b-4499-b7ea-983ae9998aa3 req-41966b69-81ec-4ff8-823a-643bc2140d8e service nova] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Processing event network-vif-plugged-b2c6e252-54fa-4458-8f57-5d9dda91f621 {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:06:34.947332 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:34.962778 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:34.977384 np0035104604 nova-compute[107505]: DEBUG
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:35.533064 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:06:35.533404 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] VM Started (Lifecycle Event) Aug 30 14:06:35.538551 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:06:35.546740 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:06:35.549924 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Instance spawned successfully. 
Aug 30 14:06:35.556786 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:06:35.573366 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:35.584109 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:06:35.588837 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:35.589184 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] 
Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:35.590067 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:35.590742 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:35.591619 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:35.592285 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:35.607857 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] 
[instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:06:35.608292 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:06:35.608642 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] VM Paused (Lifecycle Event) Aug 30 14:06:35.631582 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:35.639093 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:06:35.639421 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] VM Resumed (Lifecycle Event) Aug 30 14:06:35.681120 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:35.686083 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event 
/opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:06:35.713024 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Took 8.05 seconds to spawn the instance on the hypervisor. Aug 30 14:06:35.713740 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:35.716038 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:06:35.835025 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Took 11.43 seconds to build instance. 
Aug 30 14:06:35.857459 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-43532dbd-c7c8-452b-9ebb-e366eca78c91 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.541s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:36.803613 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-4302f8ad-6f79-4b15-b426-7fcf24064008 req-cfd6eaaf-0d1d-44f6-b214-b097e524248e service nova] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Received event network-vif-plugged-b2c6e252-54fa-4458-8f57-5d9dda91f621 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:36.803992 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-4302f8ad-6f79-4b15-b426-7fcf24064008 req-cfd6eaaf-0d1d-44f6-b214-b097e524248e service nova] Acquiring lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:36.804399 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-4302f8ad-6f79-4b15-b426-7fcf24064008 req-cfd6eaaf-0d1d-44f6-b214-b097e524248e service nova] Lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:36.808985 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-4302f8ad-6f79-4b15-b426-7fcf24064008 req-cfd6eaaf-0d1d-44f6-b214-b097e524248e service nova] Lock
"ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:36.808985 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-4302f8ad-6f79-4b15-b426-7fcf24064008 req-cfd6eaaf-0d1d-44f6-b214-b097e524248e service nova] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] No waiting events found dispatching network-vif-plugged-b2c6e252-54fa-4458-8f57-5d9dda91f621 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:06:36.810790 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-4302f8ad-6f79-4b15-b426-7fcf24064008 req-cfd6eaaf-0d1d-44f6-b214-b097e524248e service nova] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Received unexpected event network-vif-plugged-b2c6e252-54fa-4458-8f57-5d9dda91f621 for instance with vm_state active and task_state None. 
Aug 30 14:06:37.238050 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-9ccbe6d7-7b5e-4ba8-85a6-170201e1db1d tempest-ServerDiagnosticsTest-1948173886 tempest-ServerDiagnosticsTest-1948173886-project-admin] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:37.259037 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-9ccbe6d7-7b5e-4ba8-85a6-170201e1db1d tempest-ServerDiagnosticsTest-1948173886 tempest-ServerDiagnosticsTest-1948173886-project-admin] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Retrieving diagnostics Aug 30 14:06:37.526033 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Acquiring lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:37.526473 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:37.527071 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Acquiring lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:37.527339 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:37.527758 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:37.531263 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Terminating instance Aug 30 14:06:37.535136 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Start destroying the instance on the hypervisor. 
{{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:06:37.594929 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:37.610405 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:37.624589 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:37.647874 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:37.650324 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:37.806974 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Instance destroyed successfully. 
Aug 30 14:06:37.807449 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lazy-loading 'resources' on Instance uuid ad833d26-ac53-4df9-aaaf-daa2928d67f3 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:06:37.825852 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:06:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerDiagnosticsTest-server-58398265',display_name='tempest-ServerDiagnosticsTest-server-58398265',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serverdiagnosticstest-server-58398265',id=3,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:06:35Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='596b61c1a20b447d949fa872fbcb0082',ramdisk_id='',reservation_id='r-6zwo3d8s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab
-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerDiagnosticsTest-1789236084',owner_user_name='tempest-ServerDiagnosticsTest-1789236084-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-08-30T14:06:36Z,user_data=None,user_id='70b7977558f3470daaf99abc324238f4',uuid=ad833d26-ac53-4df9-aaaf-daa2928d67f3,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "address": "fa:16:3e:26:3d:98", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.181", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2c6e252-54", "ovs_interfaceid": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 
14:06:37.828977 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Converting VIF {"id": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "address": "fa:16:3e:26:3d:98", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.181", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2c6e252-54", "ovs_interfaceid": "b2c6e252-54fa-4458-8f57-5d9dda91f621", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:06:37.828977 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:3d:98,bridge_name='br-int',has_traffic_filtering=True,id=b2c6e252-54fa-4458-8f57-5d9dda91f621,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2c6e252-54') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:06:37.828977 np0035104604 nova-compute[107505]: DEBUG 
os_vif [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:3d:98,bridge_name='br-int',has_traffic_filtering=True,id=b2c6e252-54fa-4458-8f57-5d9dda91f621,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2c6e252-54') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:06:37.831167 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:37.831729 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2c6e252-54, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:06:37.834079 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:37.837619 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:06:37.842432 np0035104604 nova-compute[107505]: INFO os_vif [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:26:3d:98,bridge_name='br-int',has_traffic_filtering=True,id=b2c6e252-54fa-4458-8f57-5d9dda91f621,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2c6e252-54') Aug 30 14:06:38.238338 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Deleting instance files /opt/stack/data/nova/instances/ad833d26-ac53-4df9-aaaf-daa2928d67f3_del Aug 30 14:06:38.239064 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Deletion of /opt/stack/data/nova/instances/ad833d26-ac53-4df9-aaaf-daa2928d67f3_del complete Aug 30 14:06:38.316902 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Took 0.78 seconds to destroy the instance on the hypervisor. Aug 30 14:06:38.317803 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:06:38.317956 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:06:38.318266 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:06:38.666690 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:38.889537 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-12e19976-3da3-4225-95d4-93c06840eaff req-a5c52596-ce66-4a8d-b4e8-fd04fdb5df93 service nova] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Received event network-vif-unplugged-b2c6e252-54fa-4458-8f57-5d9dda91f621 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:38.890902 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-12e19976-3da3-4225-95d4-93c06840eaff req-a5c52596-ce66-4a8d-b4e8-fd04fdb5df93 service nova] Acquiring lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:38.891500 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-12e19976-3da3-4225-95d4-93c06840eaff req-a5c52596-ce66-4a8d-b4e8-fd04fdb5df93 service nova] Lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:38.891988 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-12e19976-3da3-4225-95d4-93c06840eaff req-a5c52596-ce66-4a8d-b4e8-fd04fdb5df93 service nova] Lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:38.892575 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-12e19976-3da3-4225-95d4-93c06840eaff req-a5c52596-ce66-4a8d-b4e8-fd04fdb5df93 service nova] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] No waiting events found dispatching network-vif-unplugged-b2c6e252-54fa-4458-8f57-5d9dda91f621 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:06:38.893068 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-12e19976-3da3-4225-95d4-93c06840eaff req-a5c52596-ce66-4a8d-b4e8-fd04fdb5df93 service nova] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Received event network-vif-unplugged-b2c6e252-54fa-4458-8f57-5d9dda91f621 for instance with task_state deleting. 
{{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Aug 30 14:06:38.893642 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-12e19976-3da3-4225-95d4-93c06840eaff req-a5c52596-ce66-4a8d-b4e8-fd04fdb5df93 service nova] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Received event network-vif-plugged-b2c6e252-54fa-4458-8f57-5d9dda91f621 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:38.894154 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-12e19976-3da3-4225-95d4-93c06840eaff req-a5c52596-ce66-4a8d-b4e8-fd04fdb5df93 service nova] Acquiring lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:38.894627 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-12e19976-3da3-4225-95d4-93c06840eaff req-a5c52596-ce66-4a8d-b4e8-fd04fdb5df93 service nova] Lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:38.895014 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-12e19976-3da3-4225-95d4-93c06840eaff req-a5c52596-ce66-4a8d-b4e8-fd04fdb5df93 service nova] Lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:38.895370 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-12e19976-3da3-4225-95d4-93c06840eaff req-a5c52596-ce66-4a8d-b4e8-fd04fdb5df93 service nova] [instance: 
ad833d26-ac53-4df9-aaaf-daa2928d67f3] No waiting events found dispatching network-vif-plugged-b2c6e252-54fa-4458-8f57-5d9dda91f621 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:06:38.895817 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-12e19976-3da3-4225-95d4-93c06840eaff req-a5c52596-ce66-4a8d-b4e8-fd04fdb5df93 service nova] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Received unexpected event network-vif-plugged-b2c6e252-54fa-4458-8f57-5d9dda91f621 for instance with vm_state active and task_state deleting. Aug 30 14:06:39.195340 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:06:39.211552 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Took 0.89 seconds to deallocate network for instance. 
Aug 30 14:06:39.274578 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:39.275282 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:39.770477 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:40.458609 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.687s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:40.464493 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 
tempest-ServerDiagnosticsTest-1789236084-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:06:40.498649 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:06:40.538169 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.263s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:40.687997 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Acquiring lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 
14:06:40.688309 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:40.709834 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Starting instance... {{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:06:40.719201 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Deleted allocations for instance ad833d26-ac53-4df9-aaaf-daa2928d67f3 Aug 30 14:06:40.922460 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7a6de218-6f6d-408f-9652-2a0b925cfff0 tempest-ServerDiagnosticsTest-1789236084 tempest-ServerDiagnosticsTest-1789236084-project-member] Lock "ad833d26-ac53-4df9-aaaf-daa2928d67f3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.396s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:40.934894 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-f4bcf2f1-99a4-47ab-96e0-df4d619da06b req-c0c0546a-0e1e-4588-902d-c7825412c3d6 service nova] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Received event 
network-vif-deleted-b2c6e252-54fa-4458-8f57-5d9dda91f621 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:40.949187 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:40.950141 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:40.959263 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:06:40.959972 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Claim successful on node np0035104604 Aug 30 14:06:41.437287 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:42.119078 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.681s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:42.126774 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:06:42.147278 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 
tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:06:42.194067 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.244s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:42.195385 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:06:42.264971 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:06:42.265689 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:06:42.384296 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:06:42.424907 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Start building block device mappings for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:06:42.642459 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:06:42.643714 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:06:42.644161 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Creating image(s) Aug 30 14:06:42.671740 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] rbd image a9c92dff-0930-42ad-92e6-4be1bd8b360f_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:42.696595 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] rbd image a9c92dff-0930-42ad-92e6-4be1bd8b360f_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:42.727003 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] rbd image a9c92dff-0930-42ad-92e6-4be1bd8b360f_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:42.730904 
np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:42.753370 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '578503f281764af4a8b61eee23a5e34e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bb4d3500105449b4a4ad968e9fc83533', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:06:42.888742 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:42.890642 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.160s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:42.894616 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:42.895245 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:42.895633 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:42.921189 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] rbd image 
a9c92dff-0930-42ad-92e6-4be1bd8b360f_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:42.924358 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e a9c92dff-0930-42ad-92e6-4be1bd8b360f_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:43.319113 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e a9c92dff-0930-42ad-92e6-4be1bd8b360f_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.394s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:43.391629 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] resizing rbd image a9c92dff-0930-42ad-92e6-4be1bd8b360f_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:06:43.528784 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] 
Successfully created port: d51f24bc-46f5-4a40-acd6-04ad86c95955 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:06:43.664518 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:43.675710 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:06:43.675989 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Ensure instance console log exists: /opt/stack/data/nova/instances/a9c92dff-0930-42ad-92e6-4be1bd8b360f/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:06:43.676528 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:43.677214 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "vgpu_resources" 
acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:43.677491 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:44.479561 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Successfully updated port: d51f24bc-46f5-4a40-acd6-04ad86c95955 {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:06:44.502261 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Acquiring lock "refresh_cache-a9c92dff-0930-42ad-92e6-4be1bd8b360f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:06:44.502735 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Acquired lock "refresh_cache-a9c92dff-0930-42ad-92e6-4be1bd8b360f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:06:44.503171 np0035104604 
nova-compute[107505]: DEBUG nova.network.neutron [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:06:44.669405 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:06:44.670174 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 321b765f-5338-445c-a408-d5ee207af425] VM Stopped (Lifecycle Event) Aug 30 14:06:44.679994 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-a2d2751a-7780-4a77-84a7-13b93a83ed76 req-296f6ccf-0101-4b82-bc2a-171a45f3134e service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Received event network-changed-d51f24bc-46f5-4a40-acd6-04ad86c95955 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:44.680522 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-a2d2751a-7780-4a77-84a7-13b93a83ed76 req-296f6ccf-0101-4b82-bc2a-171a45f3134e service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Refreshing instance network info cache due to event network-changed-d51f24bc-46f5-4a40-acd6-04ad86c95955. 
{{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:06:44.680921 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-a2d2751a-7780-4a77-84a7-13b93a83ed76 req-296f6ccf-0101-4b82-bc2a-171a45f3134e service nova] Acquiring lock "refresh_cache-a9c92dff-0930-42ad-92e6-4be1bd8b360f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:06:44.692898 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-b2536a54-08e9-485e-a315-37155a136557 None None] [instance: 321b765f-5338-445c-a408-d5ee207af425] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:44.740073 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Instance cache missing network info. 
{{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:06:45.601938 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Updating instance_info_cache with network_info: [{"id": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "address": "fa:16:3e:e0:4f:32", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd51f24bc-46", "ovs_interfaceid": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:06:45.620660 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Releasing lock "refresh_cache-a9c92dff-0930-42ad-92e6-4be1bd8b360f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:06:45.621112 np0035104604 nova-compute[107505]: DEBUG 
nova.compute.manager [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Instance network_info: |[{"id": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "address": "fa:16:3e:e0:4f:32", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd51f24bc-46", "ovs_interfaceid": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:06:45.621685 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-a2d2751a-7780-4a77-84a7-13b93a83ed76 req-296f6ccf-0101-4b82-bc2a-171a45f3134e service nova] Acquired lock "refresh_cache-a9c92dff-0930-42ad-92e6-4be1bd8b360f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:06:45.622072 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-a2d2751a-7780-4a77-84a7-13b93a83ed76 req-296f6ccf-0101-4b82-bc2a-171a45f3134e service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Refreshing network info cache for port d51f24bc-46f5-4a40-acd6-04ad86c95955 
{{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:06:45.627031 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Start _get_guest_xml network_info=[{"id": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "address": "fa:16:3e:e0:4f:32", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd51f24bc-46", "ovs_interfaceid": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=<?>,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2023-08-30T14:01:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:06:45.631567 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:06:45.634516 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:06:45.635210 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] CPU controller missing on host. 
{{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:06:45.636772 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:06:45.637112 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] CPU controller found on host. {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:06:45.638782 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:06:45.639379 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=<?>,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2023-08-30T14:01:43Z,virtual_size=<?>,visibility=<?>), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:06:45.639730 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:06:45.639999 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:06:45.640485 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:06:45.640860 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:06:45.641181 np0035104604 nova-compute[107505]: DEBUG 
nova.virt.hardware [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:06:45.641601 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:06:45.641943 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:06:45.642270 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:06:45.642585 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:06:45.642941 np0035104604 
nova-compute[107505]: DEBUG nova.virt.hardware [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:06:45.656895 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:46.310801 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.653s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:46.357124 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] rbd image a9c92dff-0930-42ad-92e6-4be1bd8b360f_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:46.361221 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Running cmd (subprocess): ceph mon dump --format=json --id 
cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:47.118558 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.757s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:47.119094 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Acquiring lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:47.120235 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:47.123885 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:06:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiagnosticsNegativeTest-server-221739488',display_name='tempest-ServerDiagnosticsNegativeTest-server-221739488',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serverdiagnosticsnegativetest-server-221739488',id=4,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bb4d3500105449b4a4ad968e9fc83533',ramdisk_id='',reservation_id='r-lna7krme',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerDiagnosticsNegativeTest-1918409125',owner_user_name='tempest-ServerDiagnosticsNegativeTest-1918409125-project-member'},tags=TagList,
task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:06:42Z,user_data=None,user_id='578503f281764af4a8b61eee23a5e34e',uuid=a9c92dff-0930-42ad-92e6-4be1bd8b360f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "address": "fa:16:3e:e0:4f:32", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd51f24bc-46", "ovs_interfaceid": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:06:47.124426 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Converting VIF {"id": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "address": "fa:16:3e:e0:4f:32", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd51f24bc-46", "ovs_interfaceid": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:06:47.128878 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:4f:32,bridge_name='br-int',has_traffic_filtering=True,id=d51f24bc-46f5-4a40-acd6-04ad86c95955,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd51f24bc-46') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:06:47.131037 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lazy-loading 'pci_devices' on Instance uuid a9c92dff-0930-42ad-92e6-4be1bd8b360f {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:06:47.143736 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 
047619d3-96a4-42df-9e90-f87c5ce4ce1c] Starting instance... {{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:06:47.150254 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] End _get_guest_xml xml= [guest domain XML elided: the element tags were stripped in capture, leaving only values; recoverable fields: uuid a9c92dff-0930-42ad-92e6-4be1bd8b360f, name instance-00000004, memory 131072, vcpus 1, nova display name tempest-ServerDiagnosticsNegativeTest-server-221739488, created 2023-08-30 14:06:45, flavor 128 MB / 1 vCPU / 1 GB root / 0 ephemeral / 0 swap, owner tempest-ServerDiagnosticsNegativeTest-1918409125-project-member / tempest-ServerDiagnosticsNegativeTest-1918409125, sysinfo OpenStack Foundation / OpenStack Nova / 27.1.0 / Virtual Machine, os type hvm, cpu model Nehalem, rng backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:06:47.160798 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Preparing to wait for external event network-vif-plugged-d51f24bc-46f5-4a40-acd6-04ad86c95955 {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:06:47.160798 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Acquiring lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:47.160798 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.000s {{(pid=107505) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:47.160798 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:47.160798 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:06:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiagnosticsNegativeTest-server-221739488',display_name='tempest-ServerDiagnosticsNegativeTest-server-221739488',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serverdiagnosticsnegativetest-server-221739488',id=4,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progr
ess=0,project_id='bb4d3500105449b4a4ad968e9fc83533',ramdisk_id='',reservation_id='r-lna7krme',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerDiagnosticsNegativeTest-1918409125',owner_user_name='tempest-ServerDiagnosticsNegativeTest-1918409125-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:06:42Z,user_data=None,user_id='578503f281764af4a8b61eee23a5e34e',uuid=a9c92dff-0930-42ad-92e6-4be1bd8b360f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "address": "fa:16:3e:e0:4f:32", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd51f24bc-46", "ovs_interfaceid": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "qbh_params": null, "qbg_params": 
null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:06:47.161807 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Converting VIF {"id": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "address": "fa:16:3e:e0:4f:32", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd51f24bc-46", "ovs_interfaceid": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:06:47.161807 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:e0:4f:32,bridge_name='br-int',has_traffic_filtering=True,id=d51f24bc-46f5-4a40-acd6-04ad86c95955,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd51f24bc-46') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:06:47.161807 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:4f:32,bridge_name='br-int',has_traffic_filtering=True,id=d51f24bc-46f5-4a40-acd6-04ad86c95955,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd51f24bc-46') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:06:47.161807 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:47.161807 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:06:47.161807 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:06:47.161807 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:47.161807 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd51f24bc-46, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:06:47.161807 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd51f24bc-46, col_values=(('external_ids', {'iface-id': 'd51f24bc-46f5-4a40-acd6-04ad86c95955', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:4f:32', 'vm-uuid': 'a9c92dff-0930-42ad-92e6-4be1bd8b360f'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:06:47.163051 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:47.166630 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:06:47.168828 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:47.170658 np0035104604 nova-compute[107505]: INFO os_vif [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:e0:4f:32,bridge_name='br-int',has_traffic_filtering=True,id=d51f24bc-46f5-4a40-acd6-04ad86c95955,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd51f24bc-46') Aug 30 14:06:47.199310 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-a2d2751a-7780-4a77-84a7-13b93a83ed76 req-296f6ccf-0101-4b82-bc2a-171a45f3134e service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Updated VIF entry in instance network info cache for port d51f24bc-46f5-4a40-acd6-04ad86c95955. {{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:06:47.199875 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-a2d2751a-7780-4a77-84a7-13b93a83ed76 req-296f6ccf-0101-4b82-bc2a-171a45f3134e service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Updating instance_info_cache with network_info: [{"id": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "address": "fa:16:3e:e0:4f:32", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd51f24bc-46", "ovs_interfaceid": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:06:47.217707 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-a2d2751a-7780-4a77-84a7-13b93a83ed76 req-296f6ccf-0101-4b82-bc2a-171a45f3134e service nova] Releasing lock "refresh_cache-a9c92dff-0930-42ad-92e6-4be1bd8b360f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:06:47.223634 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:06:47.224072 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:06:47.224395 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] No VIF found with MAC fa:16:3e:e0:4f:32, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:06:47.225183 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Using config drive Aug 30 14:06:47.247561 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] rbd image a9c92dff-0930-42ad-92e6-4be1bd8b360f_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:47.520990 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:47.521779 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: 
waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:47.527341 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:06:47.527587 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Claim successful on node np0035104604 Aug 30 14:06:47.778711 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Creating config drive at /opt/stack/data/nova/instances/a9c92dff-0930-42ad-92e6-4be1bd8b360f/disk.config Aug 30 14:06:47.783169 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/a9c92dff-0930-42ad-92e6-4be1bd8b360f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpufv8qfuk {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:47.826412 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 
tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/a9c92dff-0930-42ad-92e6-4be1bd8b360f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpufv8qfuk" returned: 0 in 0.042s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:47.857386 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] rbd image a9c92dff-0930-42ad-92e6-4be1bd8b360f_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:47.862123 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/a9c92dff-0930-42ad-92e6-4be1bd8b360f/disk.config a9c92dff-0930-42ad-92e6-4be1bd8b360f_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:48.261779 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/a9c92dff-0930-42ad-92e6-4be1bd8b360f/disk.config a9c92dff-0930-42ad-92e6-4be1bd8b360f_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.399s {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:48.262497 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Deleting local config drive /opt/stack/data/nova/instances/a9c92dff-0930-42ad-92e6-4be1bd8b360f/disk.config because it was imported into RBD. Aug 30 14:06:48.288478 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:48.304766 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:48.310323 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:48.316120 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:48.602889 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-8d6d0ee6-3b48-4862-ae01-8c5c6714d799 req-cfc20cac-e5d0-4a01-ad0e-4278f2d2065e service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Received event network-vif-plugged-d51f24bc-46f5-4a40-acd6-04ad86c95955 {{(pid=107505) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:48.603590 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-8d6d0ee6-3b48-4862-ae01-8c5c6714d799 req-cfc20cac-e5d0-4a01-ad0e-4278f2d2065e service nova] Acquiring lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:48.604082 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-8d6d0ee6-3b48-4862-ae01-8c5c6714d799 req-cfc20cac-e5d0-4a01-ad0e-4278f2d2065e service nova] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:48.604543 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-8d6d0ee6-3b48-4862-ae01-8c5c6714d799 req-cfc20cac-e5d0-4a01-ad0e-4278f2d2065e service nova] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:48.605003 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-8d6d0ee6-3b48-4862-ae01-8c5c6714d799 req-cfc20cac-e5d0-4a01-ad0e-4278f2d2065e service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Processing event network-vif-plugged-d51f24bc-46f5-4a40-acd6-04ad86c95955 {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:06:48.640612 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:48.745178 
np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:48.745178 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:48.771218 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:49.116296 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.800s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:49.121970 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:06:49.137300 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 
'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:06:49.189787 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.668s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:49.190849 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:06:49.277990 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:06:49.278933 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:06:49.389947 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:06:49.390447 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] VM Started (Lifecycle Event) Aug 30 14:06:49.393317 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Aug 30 14:06:49.397004 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:06:49.413777 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:49.413777 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:06:49.415966 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Start building block device mappings for instance. 
{{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:06:49.421798 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:06:49.424877 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Instance spawned successfully. Aug 30 14:06:49.425327 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:06:49.573841 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] During sync_power_state the instance has a pending task (spawning). Skip. 
Aug 30 14:06:49.574388 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:06:49.574795 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] VM Paused (Lifecycle Event) Aug 30 14:06:49.597179 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:49.597776 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:49.598577 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:49.599181 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 
tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:49.599826 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:49.600551 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:49.608447 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:49.612457 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:06:49.612704 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] VM Resumed (Lifecycle Event) Aug 30 14:06:49.641086 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None 
req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:49.646688 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Start spawning the instance on the hypervisor. {{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:06:49.648343 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:06:49.648898 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Creating image(s) Aug 30 14:06:49.686698 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] rbd image 047619d3-96a4-42df-9e90-f87c5ce4ce1c_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:49.718406 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] rbd image 047619d3-96a4-42df-9e90-f87c5ce4ce1c_disk does not exist 
{{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:49.743534 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] rbd image 047619d3-96a4-42df-9e90-f87c5ce4ce1c_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:49.747070 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:49.775182 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd5ee1e9a692476486157db4419d0f8b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '741ccd49161b469dad4432ec34c48751', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:06:49.781200 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 
tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Took 7.14 seconds to spawn the instance on the hypervisor. Aug 30 14:06:49.781422 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:49.920912 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:06:49.930361 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.183s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:49.931384 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by 
"nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:49.932475 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:49.933114 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:49.966971 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] rbd image 047619d3-96a4-42df-9e90-f87c5ce4ce1c_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:49.971579 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 047619d3-96a4-42df-9e90-f87c5ce4ce1c_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf 
{{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:49.997873 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:06:50.110486 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Took 9.32 seconds to build instance. Aug 30 14:06:50.149306 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5ce195f7-ee03-44ca-824a-326b021aeb25 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.459s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:50.325281 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 047619d3-96a4-42df-9e90-f87c5ce4ce1c_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.354s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:50.431283 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 
tempest-ServerExternalEventsTest-1558729978-project-member] resizing rbd image 047619d3-96a4-42df-9e90-f87c5ce4ce1c_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:06:50.560394 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:06:50.560952 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Ensure instance console log exists: /opt/stack/data/nova/instances/047619d3-96a4-42df-9e90-f87c5ce4ce1c/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:06:50.561495 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:50.562131 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 
30 14:06:50.562600 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:50.647556 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-c1021c0c-ccd9-4cb4-9858-aff3be757493 req-0cda8c40-0b08-49a6-8357-2cc0f18110c9 service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Received event network-vif-plugged-d51f24bc-46f5-4a40-acd6-04ad86c95955 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:50.647991 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c1021c0c-ccd9-4cb4-9858-aff3be757493 req-0cda8c40-0b08-49a6-8357-2cc0f18110c9 service nova] Acquiring lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:50.648449 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c1021c0c-ccd9-4cb4-9858-aff3be757493 req-0cda8c40-0b08-49a6-8357-2cc0f18110c9 service nova] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:50.648730 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c1021c0c-ccd9-4cb4-9858-aff3be757493 req-0cda8c40-0b08-49a6-8357-2cc0f18110c9 service nova] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:50.649267 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-c1021c0c-ccd9-4cb4-9858-aff3be757493 req-0cda8c40-0b08-49a6-8357-2cc0f18110c9 service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] No waiting events found dispatching network-vif-plugged-d51f24bc-46f5-4a40-acd6-04ad86c95955 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:06:50.649692 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-c1021c0c-ccd9-4cb4-9858-aff3be757493 req-0cda8c40-0b08-49a6-8357-2cc0f18110c9 service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Received unexpected event network-vif-plugged-d51f24bc-46f5-4a40-acd6-04ad86c95955 for instance with vm_state active and task_state None. Aug 30 14:06:50.788521 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Successfully created port: 4c54fc2c-8b42-4b34-8d82-ec7191453a48 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:06:51.879649 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Acquiring lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:51.880145 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None 
req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:51.880578 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Acquiring lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:51.880979 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:51.881351 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:51.889048 np0035104604 
nova-compute[107505]: INFO nova.compute.manager [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Terminating instance Aug 30 14:06:51.893877 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Start destroying the instance on the hypervisor. {{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:06:51.897808 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Successfully updated port: 4c54fc2c-8b42-4b34-8d82-ec7191453a48 {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:06:51.913955 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Acquiring lock "refresh_cache-047619d3-96a4-42df-9e90-f87c5ce4ce1c" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:06:51.914193 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Acquired lock "refresh_cache-047619d3-96a4-42df-9e90-f87c5ce4ce1c" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:06:51.914471 np0035104604 
nova-compute[107505]: DEBUG nova.network.neutron [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:06:52.103611 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:52.117047 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:52.123907 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:52.153957 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Instance destroyed successfully. 
Aug 30 14:06:52.154672 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lazy-loading 'resources' on Instance uuid a9c92dff-0930-42ad-92e6-4be1bd8b360f {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:06:52.163211 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:52.169977 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:06:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerDiagnosticsNegativeTest-server-221739488',display_name='tempest-ServerDiagnosticsNegativeTest-server-221739488',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serverdiagnosticsnegativetest-server-221739488',id=4,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:06:49Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='bb4d3500105449
b4a4ad968e9fc83533',ramdisk_id='',reservation_id='r-lna7krme',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerDiagnosticsNegativeTest-1918409125',owner_user_name='tempest-ServerDiagnosticsNegativeTest-1918409125-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-08-30T14:06:50Z,user_data=None,user_id='578503f281764af4a8b61eee23a5e34e',uuid=a9c92dff-0930-42ad-92e6-4be1bd8b360f,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "address": "fa:16:3e:e0:4f:32", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd51f24bc-46", 
"ovs_interfaceid": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:06:52.170462 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Converting VIF {"id": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "address": "fa:16:3e:e0:4f:32", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd51f24bc-46", "ovs_interfaceid": "d51f24bc-46f5-4a40-acd6-04ad86c95955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:06:52.171739 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:e0:4f:32,bridge_name='br-int',has_traffic_filtering=True,id=d51f24bc-46f5-4a40-acd6-04ad86c95955,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd51f24bc-46') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:06:52.172412 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:4f:32,bridge_name='br-int',has_traffic_filtering=True,id=d51f24bc-46f5-4a40-acd6-04ad86c95955,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd51f24bc-46') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:06:52.175654 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:52.176067 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd51f24bc-46, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:06:52.183978 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Instance cache missing network info. 
{{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:06:52.314287 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:06:52.322670 np0035104604 nova-compute[107505]: INFO os_vif [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:4f:32,bridge_name='br-int',has_traffic_filtering=True,id=d51f24bc-46f5-4a40-acd6-04ad86c95955,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd51f24bc-46') Aug 30 14:06:52.705666 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Deleting instance files /opt/stack/data/nova/instances/a9c92dff-0930-42ad-92e6-4be1bd8b360f_del Aug 30 14:06:52.706276 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Deletion of /opt/stack/data/nova/instances/a9c92dff-0930-42ad-92e6-4be1bd8b360f_del complete Aug 30 14:06:52.714381 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Received event network-changed-4c54fc2c-8b42-4b34-8d82-ec7191453a48 {{(pid=107505) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:52.714601 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Refreshing instance network info cache due to event network-changed-4c54fc2c-8b42-4b34-8d82-ec7191453a48. {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:06:52.714879 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] Acquiring lock "refresh_cache-047619d3-96a4-42df-9e90-f87c5ce4ce1c" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:06:52.907677 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:06:52.908261 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] VM Stopped (Lifecycle Event) Aug 30 14:06:52.920097 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Took 1.03 seconds to destroy the instance on the hypervisor. Aug 30 14:06:52.921367 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:06:52.921741 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:06:52.921981 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:06:53.034735 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8a7f9929-bf72-46c1-8f2d-55a5464db6fb None None] [instance: ad833d26-ac53-4df9-aaaf-daa2928d67f3] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:53.385103 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Updating instance_info_cache with network_info: [{"id": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "address": "fa:16:3e:50:3c:c0", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.204", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c54fc2c-8b", 
"ovs_interfaceid": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:06:53.470653 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Releasing lock "refresh_cache-047619d3-96a4-42df-9e90-f87c5ce4ce1c" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:06:53.471397 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Instance network_info: |[{"id": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "address": "fa:16:3e:50:3c:c0", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.204", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c54fc2c-8b", "ovs_interfaceid": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:06:53.472756 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] Acquired lock "refresh_cache-047619d3-96a4-42df-9e90-f87c5ce4ce1c" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:06:53.473533 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Refreshing network info cache for port 4c54fc2c-8b42-4b34-8d82-ec7191453a48 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:06:53.479634 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Start _get_guest_xml network_info=[{"id": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "address": "fa:16:3e:50:3c:c0", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.204", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c54fc2c-8b", "ovs_interfaceid": 
"4c54fc2c-8b42-4b34-8d82-ec7191453a48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:06:53.485525 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Aug 30 14:06:53.628065 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:06:53.628837 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:06:53.634710 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:06:53.635075 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:06:53.636986 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:06:53.637873 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:06:53.638534 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:06:53.638954 np0035104604 
nova-compute[107505]: DEBUG nova.virt.hardware [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:06:53.639989 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:06:53.640837 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:06:53.641053 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:06:53.641622 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:06:53.642078 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-07f0b295-fcd3-4d43-a236-1794d358e034 
tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:06:53.642583 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:06:53.643042 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:06:53.643521 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:06:53.664447 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:53.697057 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:53.889707 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:06:53.909778 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Took 0.99 seconds to deallocate network for instance. Aug 30 14:06:54.099687 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:54.100059 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:54.550841 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.887s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:54.574601 np0035104604 nova-compute[107505]: DEBUG 
nova.storage.rbd_utils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] rbd image 047619d3-96a4-42df-9e90-f87c5ce4ce1c_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:54.583077 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:54.808217 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-a6a84256-ad38-4aad-bfcb-077e0d5011b1 req-2f3fd706-1f26-4f8d-a392-2e872121e9b1 service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Received event network-vif-deleted-d51f24bc-46f5-4a40-acd6-04ad86c95955 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:55.213143 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.630s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:55.331638 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:06:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerExternalEventsTest-server-1778980424',display_name='tempest-ServerExternalEventsTest-server-1778980424',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serverexternaleventstest-server-1778980424',id=5,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='741ccd49161b469dad4432ec34c48751',ramdisk_id='',reservation_id='r-7dsba880',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerExternalEventsTest-1558729978',owner_user_name='tempest-ServerExternalEventsTest-1558729978-project-member'},tags=TagList,task_state='spawning',
terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:06:49Z,user_data=None,user_id='cd5ee1e9a692476486157db4419d0f8b',uuid=047619d3-96a4-42df-9e90-f87c5ce4ce1c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "address": "fa:16:3e:50:3c:c0", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.204", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c54fc2c-8b", "ovs_interfaceid": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:06:55.332211 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Converting VIF {"id": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "address": "fa:16:3e:50:3c:c0", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.204", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c54fc2c-8b", "ovs_interfaceid": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:06:55.334267 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:3c:c0,bridge_name='br-int',has_traffic_filtering=True,id=4c54fc2c-8b42-4b34-8d82-ec7191453a48,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c54fc2c-8b') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:06:55.336182 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lazy-loading 'pci_devices' on Instance uuid 047619d3-96a4-42df-9e90-f87c5ce4ce1c {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:06:55.352555 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] End _get_guest_xml xml= Aug 30 14:06:55.352555 
np0035104604 nova-compute[107505]: [libvirt guest XML elided: the domain XML logged here lost all of its markup during log extraction, leaving only text nodes between repeated syslog prefixes; recoverable values: uuid 047619d3-96a4-42df-9e90-f87c5ce4ce1c, name instance-00000005, memory 131072 KiB, 1 vCPU, nova metadata (server tempest-ServerExternalEventsTest-server-1778980424, created 2023-08-30 14:06:53, 128 MB, owner tempest-ServerExternalEventsTest-1558729978 / tempest-ServerExternalEventsTest-1558729978-project-member), sysinfo OpenStack Foundation / OpenStack Nova / 27.1.0 / Virtual Machine, os type hvm, CPU model Nehalem, rng backend /dev/urandom] {{(pid=107505) _get_guest_xml
/opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:06:55.360881 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Preparing to wait for external event network-vif-plugged-4c54fc2c-8b42-4b34-8d82-ec7191453a48 {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:06:55.360881 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Acquiring lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:55.360881 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:55.360881 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s {{(pid=107505) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:55.360881 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:06:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerExternalEventsTest-server-1778980424',display_name='tempest-ServerExternalEventsTest-server-1778980424',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serverexternaleventstest-server-1778980424',id=5,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='741ccd49161b469dad4432ec34c48751',ramdisk_id='',reservation_id='r-7dsba880',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_ow
ner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerExternalEventsTest-1558729978',owner_user_name='tempest-ServerExternalEventsTest-1558729978-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:06:49Z,user_data=None,user_id='cd5ee1e9a692476486157db4419d0f8b',uuid=047619d3-96a4-42df-9e90-f87c5ce4ce1c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "address": "fa:16:3e:50:3c:c0", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.204", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c54fc2c-8b", "ovs_interfaceid": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:06:55.361641 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Converting VIF {"id": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "address": "fa:16:3e:50:3c:c0", "network": {"id": 
"ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.204", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c54fc2c-8b", "ovs_interfaceid": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:06:55.361641 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:3c:c0,bridge_name='br-int',has_traffic_filtering=True,id=4c54fc2c-8b42-4b34-8d82-ec7191453a48,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c54fc2c-8b') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:06:55.361641 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:50:3c:c0,bridge_name='br-int',has_traffic_filtering=True,id=4c54fc2c-8b42-4b34-8d82-ec7191453a48,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c54fc2c-8b') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:06:55.362609 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:55.380862 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:55.382325 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:06:55.383356 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:06:55.391261 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:55.391954 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 
command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c54fc2c-8b, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:06:55.393025 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4c54fc2c-8b, col_values=(('external_ids', {'iface-id': '4c54fc2c-8b42-4b34-8d82-ec7191453a48', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:50:3c:c0', 'vm-uuid': '047619d3-96a4-42df-9e90-f87c5ce4ce1c'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:06:55.396678 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:55.403375 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:06:55.407011 np0035104604 nova-compute[107505]: INFO os_vif [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:3c:c0,bridge_name='br-int',has_traffic_filtering=True,id=4c54fc2c-8b42-4b34-8d82-ec7191453a48,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c54fc2c-8b') Aug 30 14:06:55.447441 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Updated VIF entry in 
instance network info cache for port 4c54fc2c-8b42-4b34-8d82-ec7191453a48. {{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:06:55.448304 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Updating instance_info_cache with network_info: [{"id": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "address": "fa:16:3e:50:3c:c0", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.204", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c54fc2c-8b", "ovs_interfaceid": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:06:55.458517 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:06:55.458915 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] No BDM found with device name hda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:06:55.459325 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] No VIF found with MAC fa:16:3e:50:3c:c0, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:06:55.460208 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Using config drive Aug 30 14:06:55.492746 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] rbd image 047619d3-96a4-42df-9e90-f87c5ce4ce1c_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:55.499018 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] Releasing lock "refresh_cache-047619d3-96a4-42df-9e90-f87c5ce4ce1c" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:06:55.499609 np0035104604 nova-compute[107505]: DEBUG 
nova.compute.manager [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Received event network-vif-unplugged-d51f24bc-46f5-4a40-acd6-04ad86c95955 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:55.500084 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] Acquiring lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:55.500575 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:55.500954 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:55.501352 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] No waiting events found dispatching network-vif-unplugged-d51f24bc-46f5-4a40-acd6-04ad86c95955 {{(pid=107505)
pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:06:55.501858 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Received event network-vif-unplugged-d51f24bc-46f5-4a40-acd6-04ad86c95955 for instance with task_state deleting. {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Aug 30 14:06:55.502260 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Received event network-vif-plugged-d51f24bc-46f5-4a40-acd6-04ad86c95955 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:55.502678 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] Acquiring lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:55.503162 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:55.504631 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f-events"
"released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:55.504631 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] No waiting events found dispatching network-vif-plugged-d51f24bc-46f5-4a40-acd6-04ad86c95955 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:06:55.505398 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-1c49d00c-173f-4374-9284-5b6d1f39779a req-50a7f60b-6ada-4246-be0e-67b30f639a9c service nova] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Received unexpected event network-vif-plugged-d51f24bc-46f5-4a40-acd6-04ad86c95955 for instance with vm_state active and task_state deleting. Aug 30 14:06:55.883074 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Creating config drive at /opt/stack/data/nova/instances/047619d3-96a4-42df-9e90-f87c5ce4ce1c/disk.config Aug 30 14:06:55.888707 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/047619d3-96a4-42df-9e90-f87c5ce4ce1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmp6vm1b9fx {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:55.918176
np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/047619d3-96a4-42df-9e90-f87c5ce4ce1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmp6vm1b9fx" returned: 0 in 0.029s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:55.946778 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] rbd image 047619d3-96a4-42df-9e90-f87c5ce4ce1c_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:06:55.950247 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/047619d3-96a4-42df-9e90-f87c5ce4ce1c/disk.config 047619d3-96a4-42df-9e90-f87c5ce4ce1c_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:06:56.051390 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.689s {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:56.057372 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:06:56.077055 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:06:56.120723 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.020s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:56.241334 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 
tempest-ServerExternalEventsTest-1558729978-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/047619d3-96a4-42df-9e90-f87c5ce4ce1c/disk.config 047619d3-96a4-42df-9e90-f87c5ce4ce1c_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.291s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:06:56.241978 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Deleting local config drive /opt/stack/data/nova/instances/047619d3-96a4-42df-9e90-f87c5ce4ce1c/disk.config because it was imported into RBD. Aug 30 14:06:56.271683 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:56.282302 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:56.286970 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:56.290761 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:56.321189 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Deleted allocations for instance 
a9c92dff-0930-42ad-92e6-4be1bd8b360f Aug 30 14:06:56.444579 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-1288fc28-a4c8-4ac4-9cae-9e8ef54c85c3 tempest-ServerDiagnosticsNegativeTest-1918409125 tempest-ServerDiagnosticsNegativeTest-1918409125-project-member] Lock "a9c92dff-0930-42ad-92e6-4be1bd8b360f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 4.564s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:56.672754 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:56.687839 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:56.702164 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:56.845507 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-73aeafa1-28bc-49dc-8aa1-03c02315b04b req-f61ef528-f731-4600-be4f-344ce26c75b1 service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Received event network-vif-plugged-4c54fc2c-8b42-4b34-8d82-ec7191453a48 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:56.846206 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-73aeafa1-28bc-49dc-8aa1-03c02315b04b req-f61ef528-f731-4600-be4f-344ce26c75b1 service nova] Acquiring lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:56.846776 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-73aeafa1-28bc-49dc-8aa1-03c02315b04b req-f61ef528-f731-4600-be4f-344ce26c75b1 service nova] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:56.847301 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-73aeafa1-28bc-49dc-8aa1-03c02315b04b req-f61ef528-f731-4600-be4f-344ce26c75b1 service nova] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:56.847969 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-73aeafa1-28bc-49dc-8aa1-03c02315b04b req-f61ef528-f731-4600-be4f-344ce26c75b1 service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Processing event network-vif-plugged-4c54fc2c-8b42-4b34-8d82-ec7191453a48 {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:06:57.188717 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:06:57.190013 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event 
/opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:06:57.190319 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] VM Started (Lifecycle Event) Aug 30 14:06:57.194523 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:06:57.198142 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Instance spawned successfully. Aug 30 14:06:57.198702 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:06:57.207442 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:57.214473 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB 
power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:06:57.221120 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:57.221530 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:57.222339 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:57.222970 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:57.223666 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 
tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:57.224340 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:06:57.240375 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:06:57.240851 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:06:57.241190 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] VM Paused (Lifecycle Event) Aug 30 14:06:57.261010 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:57.266901 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:06:57.267347 np0035104604 
nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] VM Resumed (Lifecycle Event) Aug 30 14:06:57.312071 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:57.317240 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:06:57.327913 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Took 7.68 seconds to spawn the instance on the hypervisor. Aug 30 14:06:57.328712 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:06:57.342265 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] During sync_power_state the instance has a pending task (spawning). Skip. 
Aug 30 14:06:57.408730 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Took 10.15 seconds to build instance. Aug 30 14:06:57.430951 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-07f0b295-fcd3-4d43-a236-1794d358e034 tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.310s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:58.644863 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:58.919027 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-b9d8aff5-c5eb-432b-a567-730d229f1659 req-9076de18-da9f-4e0f-879a-dfa5fea4e255 service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Received event network-vif-plugged-4c54fc2c-8b42-4b34-8d82-ec7191453a48 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:58.919440 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-b9d8aff5-c5eb-432b-a567-730d229f1659 req-9076de18-da9f-4e0f-879a-dfa5fea4e255 service nova] Acquiring lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:58.919885 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils 
[req-b9d8aff5-c5eb-432b-a567-730d229f1659 req-9076de18-da9f-4e0f-879a-dfa5fea4e255 service nova] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:58.920291 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-b9d8aff5-c5eb-432b-a567-730d229f1659 req-9076de18-da9f-4e0f-879a-dfa5fea4e255 service nova] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:58.921081 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-b9d8aff5-c5eb-432b-a567-730d229f1659 req-9076de18-da9f-4e0f-879a-dfa5fea4e255 service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] No waiting events found dispatching network-vif-plugged-4c54fc2c-8b42-4b34-8d82-ec7191453a48 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:06:58.921517 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-b9d8aff5-c5eb-432b-a567-730d229f1659 req-9076de18-da9f-4e0f-879a-dfa5fea4e255 service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Received unexpected event network-vif-plugged-4c54fc2c-8b42-4b34-8d82-ec7191453a48 for instance with vm_state active and task_state None. 
Aug 30 14:06:59.484519 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-f2e4ccea-acb6-42c9-863f-1c500598bb91 tempest-ServerExternalEventsTest-14706527 tempest-ServerExternalEventsTest-14706527-project] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Received event network-changed {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:06:59.485001 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-f2e4ccea-acb6-42c9-863f-1c500598bb91 tempest-ServerExternalEventsTest-14706527 tempest-ServerExternalEventsTest-14706527-project] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Refreshing instance network info cache due to event network-changed. {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:06:59.485644 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-f2e4ccea-acb6-42c9-863f-1c500598bb91 tempest-ServerExternalEventsTest-14706527 tempest-ServerExternalEventsTest-14706527-project] Acquiring lock "refresh_cache-047619d3-96a4-42df-9e90-f87c5ce4ce1c" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:06:59.486306 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-f2e4ccea-acb6-42c9-863f-1c500598bb91 tempest-ServerExternalEventsTest-14706527 tempest-ServerExternalEventsTest-14706527-project] Acquired lock "refresh_cache-047619d3-96a4-42df-9e90-f87c5ce4ce1c" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:06:59.486502 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-f2e4ccea-acb6-42c9-863f-1c500598bb91 tempest-ServerExternalEventsTest-14706527 tempest-ServerExternalEventsTest-14706527-project] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Building network info cache for instance {{(pid=107505) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:06:59.863025 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Acquiring lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:59.863705 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:59.863995 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Acquiring lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:06:59.864549 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:06:59.864957 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:06:59.873148 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Terminating instance Aug 30 14:06:59.876009 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Start destroying the instance on the hypervisor. 
{{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:06:59.926916 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:59.939805 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:06:59.971659 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:00.150139 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:00.159393 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Instance destroyed successfully. 
Aug 30 14:07:00.159919 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lazy-loading 'resources' on Instance uuid 047619d3-96a4-42df-9e90-f87c5ce4ce1c {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:07:00.179767 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:06:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerExternalEventsTest-server-1778980424',display_name='tempest-ServerExternalEventsTest-server-1778980424',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serverexternaleventstest-server-1778980424',id=5,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:06:57Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='741ccd49161b469dad4432ec34c48751',ramdisk_id='',reservation_id='r-7dsba880',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_r
ef='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerExternalEventsTest-1558729978',owner_user_name='tempest-ServerExternalEventsTest-1558729978-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-08-30T14:06:57Z,user_data=None,user_id='cd5ee1e9a692476486157db4419d0f8b',uuid=047619d3-96a4-42df-9e90-f87c5ce4ce1c,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "address": "fa:16:3e:50:3c:c0", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.204", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c54fc2c-8b", "ovs_interfaceid": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug 
/opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:07:00.180253 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Converting VIF {"id": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "address": "fa:16:3e:50:3c:c0", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.204", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c54fc2c-8b", "ovs_interfaceid": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:07:00.181900 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:3c:c0,bridge_name='br-int',has_traffic_filtering=True,id=4c54fc2c-8b42-4b34-8d82-ec7191453a48,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c54fc2c-8b') {{(pid=107505) nova_to_osvif_vif 
/opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:07:00.182614 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:3c:c0,bridge_name='br-int',has_traffic_filtering=True,id=4c54fc2c-8b42-4b34-8d82-ec7191453a48,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c54fc2c-8b') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:07:00.187060 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:00.187631 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c54fc2c-8b, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:07:00.195483 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:07:00.199466 np0035104604 nova-compute[107505]: INFO os_vif [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:3c:c0,bridge_name='br-int',has_traffic_filtering=True,id=4c54fc2c-8b42-4b34-8d82-ec7191453a48,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c54fc2c-8b') Aug 30 
14:07:00.744160 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Deleting instance files /opt/stack/data/nova/instances/047619d3-96a4-42df-9e90-f87c5ce4ce1c_del Aug 30 14:07:00.744517 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Deletion of /opt/stack/data/nova/instances/047619d3-96a4-42df-9e90-f87c5ce4ce1c_del complete Aug 30 14:07:00.809761 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Took 0.93 seconds to destroy the instance on the hypervisor. Aug 30 14:07:00.810404 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:07:00.810804 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:07:00.811027 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:07:01.083738 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-761acaf8-6376-4791-bbef-006d6f6839a1 req-e1cda1fc-cc56-48e9-8b2b-9be7cdea7c6c service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Received event network-vif-unplugged-4c54fc2c-8b42-4b34-8d82-ec7191453a48 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:07:01.084598 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-761acaf8-6376-4791-bbef-006d6f6839a1 req-e1cda1fc-cc56-48e9-8b2b-9be7cdea7c6c service nova] Acquiring lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:01.084598 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-761acaf8-6376-4791-bbef-006d6f6839a1 req-e1cda1fc-cc56-48e9-8b2b-9be7cdea7c6c service nova] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:01.084598 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils 
[req-761acaf8-6376-4791-bbef-006d6f6839a1 req-e1cda1fc-cc56-48e9-8b2b-9be7cdea7c6c service nova] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:01.085145 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-761acaf8-6376-4791-bbef-006d6f6839a1 req-e1cda1fc-cc56-48e9-8b2b-9be7cdea7c6c service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] No waiting events found dispatching network-vif-unplugged-4c54fc2c-8b42-4b34-8d82-ec7191453a48 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:07:01.085145 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-761acaf8-6376-4791-bbef-006d6f6839a1 req-e1cda1fc-cc56-48e9-8b2b-9be7cdea7c6c service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Received event network-vif-unplugged-4c54fc2c-8b42-4b34-8d82-ec7191453a48 for instance with task_state deleting. 
{{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Aug 30 14:07:01.085439 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-761acaf8-6376-4791-bbef-006d6f6839a1 req-e1cda1fc-cc56-48e9-8b2b-9be7cdea7c6c service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Received event network-vif-plugged-4c54fc2c-8b42-4b34-8d82-ec7191453a48 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:07:01.085439 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-761acaf8-6376-4791-bbef-006d6f6839a1 req-e1cda1fc-cc56-48e9-8b2b-9be7cdea7c6c service nova] Acquiring lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:01.085816 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-761acaf8-6376-4791-bbef-006d6f6839a1 req-e1cda1fc-cc56-48e9-8b2b-9be7cdea7c6c service nova] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:01.086005 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-761acaf8-6376-4791-bbef-006d6f6839a1 req-e1cda1fc-cc56-48e9-8b2b-9be7cdea7c6c service nova] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:01.086251 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-761acaf8-6376-4791-bbef-006d6f6839a1 req-e1cda1fc-cc56-48e9-8b2b-9be7cdea7c6c service nova] [instance: 
047619d3-96a4-42df-9e90-f87c5ce4ce1c] No waiting events found dispatching network-vif-plugged-4c54fc2c-8b42-4b34-8d82-ec7191453a48 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:07:01.086507 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-761acaf8-6376-4791-bbef-006d6f6839a1 req-e1cda1fc-cc56-48e9-8b2b-9be7cdea7c6c service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Received unexpected event network-vif-plugged-4c54fc2c-8b42-4b34-8d82-ec7191453a48 for instance with vm_state active and task_state deleting. Aug 30 14:07:01.257924 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-f2e4ccea-acb6-42c9-863f-1c500598bb91 tempest-ServerExternalEventsTest-14706527 tempest-ServerExternalEventsTest-14706527-project] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Updating instance_info_cache with network_info: [{"id": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "address": "fa:16:3e:50:3c:c0", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.204", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c54fc2c-8b", "ovs_interfaceid": "4c54fc2c-8b42-4b34-8d82-ec7191453a48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 
14:07:01.279864 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-f2e4ccea-acb6-42c9-863f-1c500598bb91 tempest-ServerExternalEventsTest-14706527 tempest-ServerExternalEventsTest-14706527-project] Releasing lock "refresh_cache-047619d3-96a4-42df-9e90-f87c5ce4ce1c" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:07:01.721417 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:07:01.740978 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Took 0.93 seconds to deallocate network for instance. Aug 30 14:07:01.781793 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-7a6286fb-08ae-46cd-9042-e9c86d72ee4d req-810783d6-7f8a-4aa5-8d4f-35dc9ae27259 service nova] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Received event network-vif-deleted-4c54fc2c-8b42-4b34-8d82-ec7191453a48 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:07:01.819071 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:01.819936 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:02.240835 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Acquiring lock "bfa80c86-477c-4e04-96e3-d7fee174f726" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:02.241373 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "bfa80c86-477c-4e04-96e3-d7fee174f726" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:02.354469 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Starting instance... 
{{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:07:02.381990 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:02.559810 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:03.041241 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.659s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:03.047349 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:07:03.061359 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-3b133569-5084-4653-bb60-d344ee1cef7f 
tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:07:03.100424 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.281s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:03.209864 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.651s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:03.218132 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:07:03.218421 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Claim successful on node np0035104604 Aug 30 14:07:03.248009 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Deleted allocations for instance 047619d3-96a4-42df-9e90-f87c5ce4ce1c Aug 30 14:07:03.442645 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3b133569-5084-4653-bb60-d344ee1cef7f tempest-ServerExternalEventsTest-1558729978 tempest-ServerExternalEventsTest-1558729978-project-member] Lock "047619d3-96a4-42df-9e90-f87c5ce4ce1c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.579s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:03.748126 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:04.406472 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.658s {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:04.413351 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:07:04.430170 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:07:04.467798 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.258s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:04.468871 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: 
bfa80c86-477c-4e04-96e3-d7fee174f726] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:07:04.545130 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Not allocating networking since 'none' was specified. {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} Aug 30 14:07:04.568541 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:07:04.586134 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Start building block device mappings for instance. 
{{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:07:04.630010 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Acquiring lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:04.630416 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:04.646784 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Lazy-loading 'flavor' on Instance uuid a9a74ed7-6eb9-4167-86a3-fe188c08af97 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:07:04.721701 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.091s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:04.733448 
np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Start spawning the instance on the hypervisor. {{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:07:04.734646 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:07:04.735199 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Creating image(s) Aug 30 14:07:04.762354 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:04.799226 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:04.831131 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 
tempest-ServersAdmin275Test-1203166925-project-member] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:04.836070 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:04.935715 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.099s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:04.936889 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:04.938326 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None 
req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:04.939117 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:04.974105 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:04.979074 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e bfa80c86-477c-4e04-96e3-d7fee174f726_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:05.025225 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task 
ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:07:05.037340 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Acquiring lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:05.038725 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.002s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:05.039532 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Attaching volume a84ce1cc-f3b7-4961-9bff-72cc724ba754 to /dev/vdb Aug 30 14:07:05.065234 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:05.065234 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None 
req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:05.065775 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:05.066287 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:07:05.066868 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:05.192917 np0035104604 nova-compute[107505]: DEBUG os_brick.utils [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '149.202.177.86', 'multipath': False, 'enforce_multipath': True, 'host': 'np0035104604', 'execute': None}" {{(pid=107505) trace_logging_wrapper /opt/stack/data/venv/lib/python3.10/site-packages/os_brick/utils.py:175}} Aug 30 14:07:05.200477 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon 
[None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmp9vsonc3s/privsep.sock'] Aug 30 14:07:05.208955 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:05.237242 np0035104604 sudo[120719]: stack : PWD=/ ; USER=root ; COMMAND=/opt/stack/data/venv/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context os_brick.privileged.default --privsep_sock_path /tmp/tmp9vsonc3s/privsep.sock Aug 30 14:07:05.237967 np0035104604 sudo[120719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Aug 30 14:07:05.256996 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e bfa80c86-477c-4e04-96e3-d7fee174f726_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.278s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:05.370390 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] resizing rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 
14:07:05.469106 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:07:05.469639 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Ensure instance console log exists: /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:07:05.470289 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:05.470954 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:05.471313 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "vgpu_resources" 
"released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:05.473946 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:07:05.478332 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Aug 30 14:07:05.481268 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:07:05.482022 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:07:05.483529 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:07:05.483871 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:07:05.485110 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:07:05.485840 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:07:05.486175 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:07:05.486450 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware 
[None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:07:05.486760 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:07:05.487032 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:07:05.487433 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:07:05.487793 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:07:05.488080 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Build topologies for 1 
vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:07:05.488907 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:07:05.489228 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:07:05.489568 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:07:05.502848 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:05.760299 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.693s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} 
Aug 30 14:07:05.854452 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000001 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:07:05.854871 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000001 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:07:05.942478 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:07:05.944645 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=854MB free_disk=29.950210571289062GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:07:05.944952 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:05.945269 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:06.175491 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance a9a74ed7-6eb9-4167-86a3-fe188c08af97 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:07:06.175491 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance bfa80c86-477c-4e04-96e3-d7fee174f726 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:07:06.176256 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 2 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:07:06.177388 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=768MB phys_disk=29GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:07:06.346472 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.844s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:06.373881 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk.config does not exist {{(pid=107505) 
__init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:06.383017 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:06.777423 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:06.964149 np0035104604 sudo[120719]: pam_unix(sudo:session): session closed for user root Aug 30 14:07:06.976494 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Spawned new privsep daemon via rootwrap Aug 30 14:07:06.978446 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon [-] privsep daemon starting Aug 30 14:07:06.978712 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Aug 30 14:07:06.978969 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none Aug 30 14:07:06.979216 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 120875 Aug 30 14:07:06.983600 np0035104604 nova-compute[107505]: DEBUG oslo.privsep.daemon [-] privsep: reply[f2d2650e-39fd-4645-90be-bccbfc0c5a79]: (2,) {{(pid=120875) _call_back 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_privsep/daemon.py:499}} Aug 30 14:07:07.094199 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.711s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:07.096331 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lazy-loading 'pci_devices' on Instance uuid bfa80c86-477c-4e04-96e3-d7fee174f726 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:07:07.111780 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] End _get_guest_xml xml= [multi-line libvirt domain XML elided: the angle-bracket markup was stripped during log capture, leaving only text nodes; surviving field values: name instance-00000006, display name tempest-ServersAdmin275Test-server-1173805866, created 2023-08-30 14:07:05, memory 131072, 1 vCPU, owner tempest-ServersAdmin275Test-1203166925-project-member / tempest-ServersAdmin275Test-1203166925, sysinfo OpenStack Foundation / OpenStack Nova 27.1.0, uuid bfa80c86-477c-4e04-96e3-d7fee174f726, Virtual Machine, os type hvm, CPU model Nehalem, rng backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:07:07.143738 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi {{(pid=120875) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:07.150211 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:07.150839 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] VM Stopped (Lifecycle Event) Aug 30 14:07:07.153507 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s {{(pid=120875) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:07.153810 np0035104604 nova-compute[107505]: DEBUG oslo.privsep.daemon [-] privsep: reply[c640c57d-acf2-4710-9155-78ffcd8baf72]: (4, ('InitiatorName=iqn.2016-04.com.open-iscsi:4918d4d097bf\n', '')) {{(pid=120875) _call_back /opt/stack/data/venv/lib/python3.10/site-packages/oslo_privsep/daemon.py:499}} Aug 30 14:07:07.157209 np0035104604 nova-compute[107505]: WARNING os_brick.initiator.connectors.nvmeof [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 
tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Could not find nvme_core/parameters/multipath: FileNotFoundError: [Errno 2] No such file or directory: '/sys/module/nvme_core/parameters/multipath' Aug 30 14:07:07.158209 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt / -n -o SOURCE {{(pid=120875) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:07.163449 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:07:07.163449 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:07:07.164324 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Using config drive Aug 30 14:07:07.203533 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:07.210342 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [-] CMD "findmnt / -n -o SOURCE" returned: 0 in 0.010s {{(pid=120875) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:07.210591 np0035104604 nova-compute[107505]: DEBUG oslo.privsep.daemon [-] privsep: reply[ef92f345-d7fd-48b7-a3f3-3aabc0d2a767]: (4, ('/dev/vda1\n', '')) {{(pid=120875) _call_back /opt/stack/data/venv/lib/python3.10/site-packages/oslo_privsep/daemon.py:499}} Aug 30 14:07:07.213651 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-dd01a33b-ca93-4628-8b6f-6d90b68f6bd3 None None] [instance: a9c92dff-0930-42ad-92e6-4be1bd8b360f] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:07.214977 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): blkid /dev/vda1 -s UUID -o value {{(pid=120875) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:07.342266 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [-] CMD "blkid /dev/vda1 -s UUID -o value" returned: 0 in 
0.024s {{(pid=120875) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
Aug 30 14:07:07.342625 np0035104604 nova-compute[107505]: DEBUG oslo.privsep.daemon [-] privsep: reply[b1a97c2f-8c44-4a1d-9fce-5153d2e45584]: (4, ('2efee5ce-54eb-4251-8b91-cb783587f333\n', '')) {{(pid=120875) _call_back /opt/stack/data/venv/lib/python3.10/site-packages/oslo_privsep/daemon.py:499}}
Aug 30 14:07:07.348790 np0035104604 nova-compute[107505]: DEBUG oslo.privsep.daemon [-] privsep: reply[b959ce03-3057-4028-bf6d-95aeb4020e2d]: (4, 'ed1e0f24-f0a1-46a6-80a0-067b49fc4977') {{(pid=120875) _call_back /opt/stack/data/venv/lib/python3.10/site-packages/oslo_privsep/daemon.py:499}}
Aug 30 14:07:07.349370 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Running cmd (subprocess): nvme version {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
Aug 30 14:07:07.364694 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] 'nvme version' failed. Not Retrying. {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:473}}
Aug 30 14:07:07.366477 np0035104604 nova-compute[107505]: DEBUG os_brick.initiator.connectors.nvmeof [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] nvme not present on system {{(pid=107505) nvme_present /opt/stack/data/venv/lib/python3.10/site-packages/os_brick/initiator/connectors/nvmeof.py:757}}
Aug 30 14:07:07.370770 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): nvme show-hostnqn {{(pid=120875) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
Aug 30 14:07:07.377825 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Creating config drive at /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config
Aug 30 14:07:07.381469 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpxs6tbre6 {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
Aug 30 14:07:07.398982 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [-] 'nvme show-hostnqn' failed. Not Retrying. {{(pid=120875) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:473}}
Aug 30 14:07:07.399250 np0035104604 nova-compute[107505]: WARNING os_brick.privileged.nvmeof [-] Could not generate host nqn: [Errno 2] No such file or directory: 'nvme'
Aug 30 14:07:07.399459 np0035104604 nova-compute[107505]: DEBUG oslo.privsep.daemon [-] privsep: reply[4cc187ff-fe34-4d2e-9ef1-aa0b057b6bd9]: (4, '') {{(pid=120875) _call_back /opt/stack/data/venv/lib/python3.10/site-packages/oslo_privsep/daemon.py:499}}
Aug 30 14:07:07.403813 np0035104604 nova-compute[107505]: DEBUG os_brick.initiator.connectors.lightos [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] LIGHTOS: [Errno 111] ECONNREFUSED {{(pid=107505) find_dsc /opt/stack/data/venv/lib/python3.10/site-packages/os_brick/initiator/connectors/lightos.py:98}}
Aug 30 14:07:07.404090 np0035104604 nova-compute[107505]: DEBUG os_brick.initiator.connectors.lightos [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] LIGHTOS: did not find dsc, continuing anyway. {{(pid=107505) get_connector_properties /opt/stack/data/venv/lib/python3.10/site-packages/os_brick/initiator/connectors/lightos.py:76}}
Aug 30 14:07:07.404921 np0035104604 nova-compute[107505]: DEBUG os_brick.initiator.connectors.lightos [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] LIGHTOS: no hostnqn found. {{(pid=107505) get_connector_properties /opt/stack/data/venv/lib/python3.10/site-packages/os_brick/initiator/connectors/lightos.py:84}}
Aug 30 14:07:07.405415 np0035104604 nova-compute[107505]: DEBUG os_brick.utils [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] <== get_connector_properties: return (2211ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '149.202.177.86', 'host': 'np0035104604', 'multipath': False, 'initiator': 'iqn.2016-04.com.open-iscsi:4918d4d097bf', 'do_local_attach': False, 'uuid': '2efee5ce-54eb-4251-8b91-cb783587f333', 'system uuid': 'ed1e0f24-f0a1-46a6-80a0-067b49fc4977', 'nvme_native_multipath': False} {{(pid=107505) trace_logging_wrapper /opt/stack/data/venv/lib/python3.10/site-packages/os_brick/utils.py:202}}
Aug 30 14:07:07.405999 np0035104604 nova-compute[107505]: DEBUG nova.virt.block_device [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Updating existing volume attachment record: d97dba86-1f1a-4821-aef9-e4da552813cd {{(pid=107505) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}}
Aug 30 14:07:07.410841 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpxs6tbre6" returned: 0 in 0.029s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
Aug 30 14:07:07.450713 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None
req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}}
Aug 30 14:07:07.454484 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config bfa80c86-477c-4e04-96e3-d7fee174f726_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
Aug 30 14:07:07.555145 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.777s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
Aug 30 14:07:07.560813 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
Aug 30 14:07:07.582428 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
Aug 30 14:07:07.593052 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config bfa80c86-477c-4e04-96e3-d7fee174f726_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.139s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
Aug 30 14:07:07.593648 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Deleting local config drive /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config because it was imported into RBD.
Aug 30 14:07:07.619864 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
Aug 30 14:07:07.620223 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.675s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
Aug 30 14:07:08.488129 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}}
Aug 30 14:07:08.488746 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] VM Resumed (Lifecycle Event)
Aug 30 14:07:08.492994 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Instance event wait completed in 0 seconds for {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}}
Aug 30 14:07:08.493449 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}}
Aug 30 14:07:08.508960 np0035104604 nova-compute[107505]: DEBUG
nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}}
Aug 30 14:07:08.513215 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Instance spawned successfully.
Aug 30 14:07:08.515846 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}}
Aug 30 14:07:08.535851 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}}
Aug 30 14:07:08.550405 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}}
Aug 30 14:07:08.550831 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}}
Aug 30 14:07:08.551946 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}}
Aug 30 14:07:08.553497 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}}
Aug 30 14:07:08.554462 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}}
Aug 30 14:07:08.555400 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}}
Aug 30 14:07:08.568144 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] During sync_power_state the instance has a pending task (spawning). Skip.
Aug 30 14:07:08.569054 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}}
Aug 30 14:07:08.569437 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] VM Started (Lifecycle Event)
Aug 30 14:07:08.576614 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.._cache_volume_driver" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
Aug 30 14:07:08.577171 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.._cache_volume_driver" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
Aug 30 14:07:08.584539 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.._cache_volume_driver" :: held 0.007s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
Aug 30 14:07:08.591064 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}}
Aug 30 14:07:08.596562 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}}
Aug 30 14:07:08.601343 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Lazy-loading 'flavor' on Instance uuid a9a74ed7-6eb9-4167-86a3-fe188c08af97 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}}
Aug 30 14:07:08.623998 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Attempting to attach volume a84ce1cc-f3b7-4961-9bff-72cc724ba754 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device.
{{(pid=107505) _check_discard_for_attach_volume /opt/stack/nova/nova/virt/libvirt/driver.py:2158}}
Aug 30 14:07:08.626563 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] During sync_power_state the instance has a pending task (spawning). Skip.
Aug 30 14:07:08.628415 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.guest [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] attach device xml:
Aug 30 14:07:08.628415 np0035104604 nova-compute[107505]:
Aug 30 14:07:08.628415 np0035104604 nova-compute[107505]:
Aug 30 14:07:08.628415 np0035104604 nova-compute[107505]:
Aug 30 14:07:08.628415 np0035104604 nova-compute[107505]:
Aug 30 14:07:08.628415 np0035104604 nova-compute[107505]:
Aug 30 14:07:08.628415 np0035104604 nova-compute[107505]:
Aug 30 14:07:08.628415 np0035104604 nova-compute[107505]:
Aug 30 14:07:08.628415 np0035104604 nova-compute[107505]:
Aug 30 14:07:08.628415 np0035104604 nova-compute[107505]:
Aug 30 14:07:08.628415 np0035104604 nova-compute[107505]: a84ce1cc-f3b7-4961-9bff-72cc724ba754
Aug 30 14:07:08.628415 np0035104604 nova-compute[107505]:
Aug 30 14:07:08.628415 np0035104604 nova-compute[107505]: {{(pid=107505) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:339}}
Aug 30 14:07:08.647082 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}}
Aug 30 14:07:08.666786 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Took 3.93 seconds to spawn the instance on the hypervisor.
Aug 30 14:07:08.667166 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}}
Aug 30 14:07:08.739645 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Took 6.32 seconds to build instance.
Aug 30 14:07:08.757167 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-87d0294d-9d40-4299-a4a1-9b1a41b131c4 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "bfa80c86-477c-4e04-96e3-d7fee174f726" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.516s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
Aug 30 14:07:08.810518 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}}
Aug 30 14:07:08.811286 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] No BDM found with device name vdb, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}}
Aug 30 14:07:08.811625 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] No BDM found with device name hda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}}
Aug 30 14:07:08.812226 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] No VIF found with MAC fa:16:3e:c3:43:e6, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}}
Aug 30 14:07:08.986405 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d1bfceaf-0235-49df-a72e-f936df1dbf21 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 3.948s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
Aug 30 14:07:09.767854 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-651685ff-df81-4190-9779-fccd30954c56 tempest-VolumesAssistedSnapshotsTest-4850509 tempest-VolumesAssistedSnapshotsTest-4850509-project] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] volume_snapshot_create: create_info: {'snapshot_id': '77813747-313d-4764-8019-8dc95104b60b', 'type': 'qcow2', 'new_file': 'new_file'} {{(pid=107505) volume_snapshot_create /opt/stack/nova/nova/virt/libvirt/driver.py:3566}}
Aug 30 14:07:09.785262 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [None
req-651685ff-df81-4190-9779-fccd30954c56 tempest-VolumesAssistedSnapshotsTest-4850509 tempest-VolumesAssistedSnapshotsTest-4850509-project] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Error occurred during volume_snapshot_create, sending error status to Cinder.: nova.exception.InternalError: Found no disk to snapshot.
Aug 30 14:07:09.785262 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Traceback (most recent call last):
Aug 30 14:07:09.785262 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97]   File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3584, in volume_snapshot_create
Aug 30 14:07:09.785262 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97]     self._volume_snapshot_create(context, instance, guest,
Aug 30 14:07:09.785262 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97]   File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3471, in _volume_snapshot_create
Aug 30 14:07:09.785262 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97]     raise exception.InternalError(msg)
Aug 30 14:07:09.785262 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] nova.exception.InternalError: Found no disk to snapshot.
Aug 30 14:07:09.785262 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97]
Aug 30 14:07:09.820961 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5741f97f-7a08-4850-8067-626564e70c8f tempest-VolumesAssistedSnapshotsTest-4850509 tempest-VolumesAssistedSnapshotsTest-4850509-project] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] volume_snapshot_delete: delete_info: {'volume_id': 'a84ce1cc-f3b7-4961-9bff-72cc724ba754'} {{(pid=107505) _volume_snapshot_delete /opt/stack/nova/nova/virt/libvirt/driver.py:3667}}
Aug 30 14:07:09.822006 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [None req-5741f97f-7a08-4850-8067-626564e70c8f tempest-VolumesAssistedSnapshotsTest-4850509 tempest-VolumesAssistedSnapshotsTest-4850509-project] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Error occurred during volume_snapshot_delete, sending error status to Cinder.: KeyError: 'type'
Aug 30 14:07:09.822006 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Traceback (most recent call last):
Aug 30 14:07:09.822006 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97]   File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3840, in volume_snapshot_delete
Aug 30 14:07:09.822006 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97]     self._volume_snapshot_delete(context, instance, volume_id,
Aug 30 14:07:09.822006 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97]   File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3670, in _volume_snapshot_delete
Aug 30 14:07:09.822006 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97]     if delete_info['type'] != 'qcow2':
Aug 30 14:07:09.822006 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] KeyError: 'type'
Aug 30 14:07:09.822006 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97]
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [None req-651685ff-df81-4190-9779-fccd30954c56 tempest-VolumesAssistedSnapshotsTest-4850509 tempest-VolumesAssistedSnapshotsTest-4850509-project] Failed to send updated snapshot status to volume service.: nova.exception.SnapshotNotFound: Snapshot 77813747-313d-4764-8019-8dc95104b60b could not be found.
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver   File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3584, in volume_snapshot_create
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver     self._volume_snapshot_create(context, instance, guest,
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver   File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3471, in _volume_snapshot_create
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver     raise exception.InternalError(msg)
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver nova.exception.InternalError: Found no disk to snapshot.
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred: Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver Traceback (most recent call last): Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 466, in wrapper Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver res = method(self, ctx, snapshot_id, *args, **kwargs) Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 761, in update_snapshot_status Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver vs.update_snapshot_status( Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/v3/volume_snapshots.py", line 225, in update_snapshot_status Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver return self._action('os-update_snapshot_status', Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/v3/volume_snapshots.py", line 221, in _action Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver resp, body = self.api.client.post(url, body=body) Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/client.py", line 223, in post Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR 
nova.virt.libvirt.driver return self._cs_request(url, 'POST', **kwargs)
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/client.py", line 211, in _cs_request
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver return self.request(url, method, **kwargs)
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/client.py", line 197, in request
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver raise exceptions.from_response(resp, body)
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver cinderclient.exceptions.NotFound: Snapshot 77813747-313d-4764-8019-8dc95104b60b could not be found. (HTTP 404) (Request-ID: req-6a055e8c-85df-407d-b01a-ea09305158d9)
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred:
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3406, in _volume_snapshot_update_status
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver self._volume_api.update_snapshot_status(context,
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 397, in wrapper
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver res = method(self, ctx, *args, **kwargs)
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 468, in wrapper
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver _reraise(exception.SnapshotNotFound(snapshot_id=snapshot_id))
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 488, in _reraise
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver raise desired_exc.with_traceback(sys.exc_info()[2])
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 466, in wrapper
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver res = method(self, ctx, snapshot_id, *args, **kwargs)
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 761, in update_snapshot_status
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver vs.update_snapshot_status(
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/v3/volume_snapshots.py", line 225, in update_snapshot_status
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver return self._action('os-update_snapshot_status',
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/v3/volume_snapshots.py", line 221, in _action
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver resp, body = self.api.client.post(url, body=body)
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/client.py", line 223, in post
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver return self._cs_request(url, 'POST', **kwargs)
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/client.py", line 211, in _cs_request
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver return self.request(url, method, **kwargs)
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/client.py", line 197, in request
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver raise exceptions.from_response(resp, body)
Aug 30 14:07:09.882854 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver nova.exception.SnapshotNotFound: Snapshot 77813747-313d-4764-8019-8dc95104b60b could not be found.
Aug 30 14:07:09.888428 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server [None req-651685ff-df81-4190-9779-fccd30954c56 tempest-VolumesAssistedSnapshotsTest-4850509 tempest-VolumesAssistedSnapshotsTest-4850509-project] Exception during message handling: nova.exception.InternalError: Found no disk to snapshot.
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 244, in inner
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server return func(*args, **kwargs)
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server self.force_reraise()
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server raise self.value
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw)
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 4425, in volume_snapshot_create
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server self.driver.volume_snapshot_create(context, instance, volume_id,
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3587, in volume_snapshot_create
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server self.force_reraise()
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server raise self.value
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3584, in volume_snapshot_create
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server self._volume_snapshot_create(context, instance, guest,
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3471, in _volume_snapshot_create
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server raise exception.InternalError(msg)
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server nova.exception.InternalError: Found no disk to snapshot.
Aug 30 14:07:09.903516 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver [None req-5741f97f-7a08-4850-8067-626564e70c8f tempest-VolumesAssistedSnapshotsTest-4850509 tempest-VolumesAssistedSnapshotsTest-4850509-project] Failed to send updated snapshot status to volume service.: nova.exception.SnapshotNotFound: Snapshot None could not be found.
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3840, in volume_snapshot_delete
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver self._volume_snapshot_delete(context, instance, volume_id,
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3670, in _volume_snapshot_delete
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver if delete_info['type'] != 'qcow2':
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver KeyError: 'type'
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred:
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 466, in wrapper
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver res = method(self, ctx, snapshot_id, *args, **kwargs)
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 761, in update_snapshot_status
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver vs.update_snapshot_status(
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/v3/volume_snapshots.py", line 225, in update_snapshot_status
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver return self._action('os-update_snapshot_status',
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/v3/volume_snapshots.py", line 221, in _action
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver resp, body = self.api.client.post(url, body=body)
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/client.py", line 223, in post
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver return self._cs_request(url, 'POST', **kwargs)
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/client.py", line 211, in _cs_request
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver return self.request(url, method, **kwargs)
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/client.py", line 197, in request
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver raise exceptions.from_response(resp, body)
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver cinderclient.exceptions.NotFound: Snapshot None could not be found. (HTTP 404) (Request-ID: req-873c409f-76ef-423b-bf47-8296490c8a18)
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred:
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3406, in _volume_snapshot_update_status
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver self._volume_api.update_snapshot_status(context,
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 397, in wrapper
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver res = method(self, ctx, *args, **kwargs)
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 468, in wrapper
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver _reraise(exception.SnapshotNotFound(snapshot_id=snapshot_id))
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 488, in _reraise
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver raise desired_exc.with_traceback(sys.exc_info()[2])
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 466, in wrapper
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver res = method(self, ctx, snapshot_id, *args, **kwargs)
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/nova/nova/volume/cinder.py", line 761, in update_snapshot_status
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver vs.update_snapshot_status(
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/v3/volume_snapshots.py", line 225, in update_snapshot_status
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver return self._action('os-update_snapshot_status',
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/v3/volume_snapshots.py", line 221, in _action
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver resp, body = self.api.client.post(url, body=body)
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/client.py", line 223, in post
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver return self._cs_request(url, 'POST', **kwargs)
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/client.py", line 211, in _cs_request
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver return self.request(url, method, **kwargs)
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver File "/opt/stack/data/venv/lib/python3.10/site-packages/cinderclient/client.py", line 197, in request
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver raise exceptions.from_response(resp, body)
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver nova.exception.SnapshotNotFound: Snapshot None could not be found.
Aug 30 14:07:09.906544 np0035104604 nova-compute[107505]: ERROR nova.virt.libvirt.driver
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server [None req-5741f97f-7a08-4850-8067-626564e70c8f tempest-VolumesAssistedSnapshotsTest-4850509 tempest-VolumesAssistedSnapshotsTest-4850509-project] Exception during message handling: KeyError: 'type'
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 244, in inner
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server return func(*args, **kwargs)
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server self.force_reraise()
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server raise self.value
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw)
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 4437, in volume_snapshot_delete
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server self.driver.volume_snapshot_delete(context, instance, volume_id,
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3843, in volume_snapshot_delete
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server self.force_reraise()
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server raise self.value
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3840, in volume_snapshot_delete
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server self._volume_snapshot_delete(context, instance, volume_id,
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 3670, in _volume_snapshot_delete
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server if delete_info['type'] != 'qcow2':
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server KeyError: 'type'
Aug 30 14:07:09.917357 np0035104604 nova-compute[107505]: ERROR oslo_messaging.rpc.server
Aug 30 14:07:10.211539 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}}
Aug 30 14:07:10.248112 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5d18515e-fe1c-49ce-b298-4bf2d1644f48 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Acquiring lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
Aug 30 14:07:10.248528 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5d18515e-fe1c-49ce-b298-4bf2d1644f48 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97" acquired by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
Aug 30 14:07:10.265109 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-5d18515e-fe1c-49ce-b298-4bf2d1644f48 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Detaching volume a84ce1cc-f3b7-4961-9bff-72cc724ba754
Aug 30 14:07:10.366675 np0035104604 nova-compute[107505]: INFO nova.virt.block_device [None req-5d18515e-fe1c-49ce-b298-4bf2d1644f48 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Attempting to driver detach volume a84ce1cc-f3b7-4961-9bff-72cc724ba754 from mountpoint /dev/vdb
Aug 30 14:07:10.378257 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5d18515e-fe1c-49ce-b298-4bf2d1644f48 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Attempting to detach device vdb from instance a9a74ed7-6eb9-4167-86a3-fe188c08af97 from the persistent domain config. {{(pid=107505) _detach_from_persistent /opt/stack/nova/nova/virt/libvirt/driver.py:2477}}
Aug 30 14:07:10.378718 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.guest [None req-5d18515e-fe1c-49ce-b298-4bf2d1644f48 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] detach device xml:
Aug 30 14:07:10.378718 np0035104604 nova-compute[107505]:
Aug 30 14:07:10.378718 np0035104604 nova-compute[107505]:
Aug 30 14:07:10.378718 np0035104604 nova-compute[107505]:
Aug 30 14:07:10.378718 np0035104604 nova-compute[107505]:
Aug 30 14:07:10.378718 np0035104604 nova-compute[107505]:
Aug 30 14:07:10.378718 np0035104604 nova-compute[107505]: a84ce1cc-f3b7-4961-9bff-72cc724ba754
Aug 30 14:07:10.378718 np0035104604 nova-compute[107505]:
Aug 30 14:07:10.378718 np0035104604 nova-compute[107505]:
Aug 30 14:07:10.378718 np0035104604 nova-compute[107505]: {{(pid=107505) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:465}}
Aug 30 14:07:10.390149 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-5d18515e-fe1c-49ce-b298-4bf2d1644f48 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Successfully detached device vdb from instance a9a74ed7-6eb9-4167-86a3-fe188c08af97 from the persistent domain config.
Aug 30 14:07:10.390677 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5d18515e-fe1c-49ce-b298-4bf2d1644f48 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance a9a74ed7-6eb9-4167-86a3-fe188c08af97 from the live domain config. {{(pid=107505) _detach_from_live_with_retry /opt/stack/nova/nova/virt/libvirt/driver.py:2513}}
Aug 30 14:07:10.391218 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.guest [None req-5d18515e-fe1c-49ce-b298-4bf2d1644f48 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] detach device xml:
Aug 30 14:07:10.391218 np0035104604 nova-compute[107505]:
Aug 30 14:07:10.391218 np0035104604 nova-compute[107505]:
Aug 30 14:07:10.391218 np0035104604 nova-compute[107505]:
Aug 30 14:07:10.391218 np0035104604 nova-compute[107505]:
Aug 30 14:07:10.391218 np0035104604 nova-compute[107505]:
Aug 30 14:07:10.391218 np0035104604 nova-compute[107505]: a84ce1cc-f3b7-4961-9bff-72cc724ba754
Aug 30 14:07:10.391218 np0035104604 nova-compute[107505]:
Aug 30 14:07:10.391218 np0035104604 nova-compute[107505]:
Aug 30 14:07:10.391218 np0035104604 nova-compute[107505]: {{(pid=107505) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:465}}
Aug 30 14:07:10.502555 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}}
Aug 30 14:07:10.614421 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
Aug 30 14:07:10.614836 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
Aug 30 14:07:10.617247 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
Aug 30 14:07:10.617812 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}}
Aug 30 14:07:10.618140 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}}
Aug 30 14:07:10.895421 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "refresh_cache-a9a74ed7-6eb9-4167-86a3-fe188c08af97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
Aug 30 14:07:10.896519 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquired lock "refresh_cache-a9a74ed7-6eb9-4167-86a3-fe188c08af97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
Aug 30 14:07:10.897167 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Forcefully refreshing network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}}
Aug 30 14:07:10.898258 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lazy-loading 'info_cache' on Instance uuid a9a74ed7-6eb9-4167-86a3-fe188c08af97 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}}
Aug 30 14:07:11.119005 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Rebuilding instance
Aug 30 14:07:11.586356 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Received event virtio-disk1> from libvirt while the driver is waiting for it; dispatched. {{(pid=107505) emit_event /opt/stack/nova/nova/virt/libvirt/driver.py:2360}}
Aug 30 14:07:11.587007 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5d18515e-fe1c-49ce-b298-4bf2d1644f48 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance a9a74ed7-6eb9-4167-86a3-fe188c08af97 {{(pid=107505) _detach_from_live_and_wait_for_event /opt/stack/nova/nova/virt/libvirt/driver.py:2589}}
Aug 30 14:07:11.592709 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-5d18515e-fe1c-49ce-b298-4bf2d1644f48 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Successfully detached device vdb from instance a9a74ed7-6eb9-4167-86a3-fe188c08af97 from the live domain config.
Aug 30 14:07:11.905848 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}}
Aug 30 14:07:11.959417 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-5d18515e-fe1c-49ce-b298-4bf2d1644f48 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Lazy-loading 'flavor' on Instance uuid a9a74ed7-6eb9-4167-86a3-fe188c08af97 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}}
Aug 30 14:07:12.149496 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5d18515e-fe1c-49ce-b298-4bf2d1644f48 tempest-VolumesAssistedSnapshotsTest-2065217557 tempest-VolumesAssistedSnapshotsTest-2065217557-project-admin] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97" "released" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: held 1.901s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
Aug 30 14:07:12.195839 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Instance destroyed successfully.
Aug 30 14:07:12.202771 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Instance destroyed successfully.
Aug 30 14:07:12.976861 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Deleting instance files /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726_del
Aug 30 14:07:12.978086 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Deletion of /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726_del complete
Aug 30 14:07:12.987717 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}}
Aug 30 14:07:13.078737 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Updating instance_info_cache with network_info: [{"id": "246226ea-1f45-4c14-95cc-92ee432293be", "address": "fa:16:3e:c3:43:e6", "network": {"id": "dc6199ed-e883-401b-91de-b1a4e5a0056d", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1905884620-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "52df4bef677b4f86bc2cd5ee880b7cc1", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap246226ea-1f", "ovs_interfaceid": "246226ea-1f45-4c14-95cc-92ee432293be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
Aug 30 14:07:13.111181 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Releasing lock "refresh_cache-a9a74ed7-6eb9-4167-86a3-fe188c08af97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
Aug 30 14:07:13.112042 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Updated the network info_cache for instance {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}}
Aug 30 14:07:13.113693 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
Aug 30 14:07:13.114272 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
Aug 30 14:07:13.114766 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
Aug 30 14:07:13.115248 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
Aug 30 14:07:13.116748 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
Aug 30 14:07:13.117430 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
Aug 30 14:07:13.117937 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:07:13.246863 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:07:13.247854 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Creating image(s) Aug 30 14:07:13.285070 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:13.321373 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:13.357431 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:13.361555 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None 
req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Acquiring lock "d279ce3c0a0edcb9bee120b54ec031d664e9e8d4" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:13.362726 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "d279ce3c0a0edcb9bee120b54ec031d664e9e8d4" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:13.647820 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:13.654965 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.imagebackend [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Image locations are: [{'url': 'rbd://b94dc20f-5638-48f6-87b0-a600a745ee27/images/bd27109d-8758-41af-af44-0e764a2addca/snap', 'metadata': {}}] {{(pid=107505) clone /opt/stack/nova/nova/virt/libvirt/imagebackend.py:1070}} Aug 30 14:07:13.971897 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Acquiring lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:13.972346 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:13.972801 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Acquiring lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:13.973370 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:13.973863 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 
0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:13.977263 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Terminating instance Aug 30 14:07:13.981482 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Start destroying the instance on the hypervisor. {{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:07:14.147010 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:14.163662 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:14.196607 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:14.225786 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:14.670538 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d279ce3c0a0edcb9bee120b54ec031d664e9e8d4.part --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:14.691751 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:14.697495 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-6f7bf6f8-1abb-4290-9bca-ed0b0f39e1c6 req-cee411d4-92f3-424d-976f-04bab4c8d216 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Received event network-vif-unplugged-246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:07:14.698343 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-6f7bf6f8-1abb-4290-9bca-ed0b0f39e1c6 req-cee411d4-92f3-424d-976f-04bab4c8d216 service nova] Acquiring lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:14.698953 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-6f7bf6f8-1abb-4290-9bca-ed0b0f39e1c6 req-cee411d4-92f3-424d-976f-04bab4c8d216 service nova] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:14.699403 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-6f7bf6f8-1abb-4290-9bca-ed0b0f39e1c6 req-cee411d4-92f3-424d-976f-04bab4c8d216 service nova] Lock 
"a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:14.699801 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-6f7bf6f8-1abb-4290-9bca-ed0b0f39e1c6 req-cee411d4-92f3-424d-976f-04bab4c8d216 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] No waiting events found dispatching network-vif-unplugged-246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:07:14.700257 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-6f7bf6f8-1abb-4290-9bca-ed0b0f39e1c6 req-cee411d4-92f3-424d-976f-04bab4c8d216 service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Received event network-vif-unplugged-246226ea-1f45-4c14-95cc-92ee432293be for instance with task_state deleting. {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Aug 30 14:07:14.704804 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Instance destroyed successfully. 
Aug 30 14:07:14.706390 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lazy-loading 'resources' on Instance uuid a9a74ed7-6eb9-4167-86a3-fe188c08af97 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:07:14.720788 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:05:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-VolumesAssistedSnapshotsTest-server-1478890135',display_name='tempest-VolumesAssistedSnapshotsTest-server-1478890135',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-volumesassistedsnapshotstest-server-1478890135',id=1,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFqC8uh2+YoNrXptc5ixgT/phgGQy0l9+/1kPf9/2AAS72msw7v6bohpg0KzDaAlfBGjXfBxccFhilsQDIM2reOBWA+UPM9yHEzoOy5G3dUW1uITle7QYHXulfuGIetqcQ==',key_name='tempest-keypair-649641997',keypairs=<?>,launch_index=0,launched_at=2023-08-30T14:05:59Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='52df4bef677b4f86bc2cd5ee880b7cc1',ramdisk_id='',reservation_id='r-ooxvtgat',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAssistedSnapshotsTest-74693406',owner_user_name='tempest-VolumesAssistedSnapshotsTest-74693406-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-08-30T14:05:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='113a433e1bcc438ea91d3b1b0508f251',uuid=a9a74ed7-6eb9-4167-86a3-fe188c08af97,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "246226ea-1f45-4c14-95cc-92ee432293be", "address": "fa:16:3e:c3:43:e6", "network": {"id": 
"dc6199ed-e883-401b-91de-b1a4e5a0056d", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1905884620-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "52df4bef677b4f86bc2cd5ee880b7cc1", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap246226ea-1f", "ovs_interfaceid": "246226ea-1f45-4c14-95cc-92ee432293be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:07:14.721344 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Converting VIF {"id": "246226ea-1f45-4c14-95cc-92ee432293be", "address": "fa:16:3e:c3:43:e6", "network": {"id": "dc6199ed-e883-401b-91de-b1a4e5a0056d", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1905884620-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "52df4bef677b4f86bc2cd5ee880b7cc1", "mtu": 1372, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap246226ea-1f", "ovs_interfaceid": "246226ea-1f45-4c14-95cc-92ee432293be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:07:14.722830 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:43:e6,bridge_name='br-int',has_traffic_filtering=True,id=246226ea-1f45-4c14-95cc-92ee432293be,network=Network(dc6199ed-e883-401b-91de-b1a4e5a0056d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap246226ea-1f') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:07:14.723330 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:43:e6,bridge_name='br-int',has_traffic_filtering=True,id=246226ea-1f45-4c14-95cc-92ee432293be,network=Network(dc6199ed-e883-401b-91de-b1a4e5a0056d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap246226ea-1f') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:07:14.727113 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:14.727612 
np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap246226ea-1f, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:07:14.741050 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:14.741050 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:07:14.751451 np0035104604 nova-compute[107505]: INFO os_vif [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:43:e6,bridge_name='br-int',has_traffic_filtering=True,id=246226ea-1f45-4c14-95cc-92ee432293be,network=Network(dc6199ed-e883-401b-91de-b1a4e5a0056d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap246226ea-1f') Aug 30 14:07:14.799621 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d279ce3c0a0edcb9bee120b54ec031d664e9e8d4.part --force-share --output=json" returned: 0 in 0.133s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:14.799621 np0035104604 nova-compute[107505]: DEBUG nova.virt.images [None 
req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] bd27109d-8758-41af-af44-0e764a2addca was qcow2, converting to raw {{(pid=107505) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Aug 30 14:07:14.801676 np0035104604 nova-compute[107505]: DEBUG nova.privsep.utils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=107505) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Aug 30 14:07:14.802386 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/d279ce3c0a0edcb9bee120b54ec031d664e9e8d4.part /opt/stack/data/nova/instances/_base/d279ce3c0a0edcb9bee120b54ec031d664e9e8d4.converted {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:14.953562 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/d279ce3c0a0edcb9bee120b54ec031d664e9e8d4.part /opt/stack/data/nova/instances/_base/d279ce3c0a0edcb9bee120b54ec031d664e9e8d4.converted" returned: 0 in 0.150s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:14.960804 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 
tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d279ce3c0a0edcb9bee120b54ec031d664e9e8d4.converted --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:15.040318 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d279ce3c0a0edcb9bee120b54ec031d664e9e8d4.converted --force-share --output=json" returned: 0 in 0.079s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:15.041440 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "d279ce3c0a0edcb9bee120b54ec031d664e9e8d4" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.679s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:15.070070 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:15.077069 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d279ce3c0a0edcb9bee120b54ec031d664e9e8d4 bfa80c86-477c-4e04-96e3-d7fee174f726_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:15.151685 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:15.152176 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] VM Stopped (Lifecycle Event) Aug 30 14:07:15.203131 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-54ee5c6c-66e0-443b-af83-1df5255eeaa9 None None] [instance: 047619d3-96a4-42df-9e90-f87c5ce4ce1c] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:15.833187 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d279ce3c0a0edcb9bee120b54ec031d664e9e8d4 bfa80c86-477c-4e04-96e3-d7fee174f726_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.755s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:15.891868 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Deleting instance files 
/opt/stack/data/nova/instances/a9a74ed7-6eb9-4167-86a3-fe188c08af97_del Aug 30 14:07:15.892920 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Deletion of /opt/stack/data/nova/instances/a9a74ed7-6eb9-4167-86a3-fe188c08af97_del complete Aug 30 14:07:16.022687 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] resizing rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:07:16.077846 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Took 2.10 seconds to destroy the instance on the hypervisor. Aug 30 14:07:16.078659 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:07:16.079609 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:07:16.079879 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:07:16.432875 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:07:16.433610 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Ensure instance console log exists: /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:07:16.434450 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:16.435087 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:16.435535 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:16.438309 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='0c839612eb3f2469420f2ccae990827f',container_format='bare',created_at=2023-08-30T14:01:42Z,direct_url=,disk_format='qcow2',id=bd27109d-8758-41af-af44-0e764a2addca,min_disk=0,min_ram=0,name='cirros-0.6.1-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21233664,status='active',tags=,updated_at=2023-08-30T14:01:49Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 
'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:07:16.446546 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError Aug 30 14:07:16.449487 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:07:16.450642 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:07:16.454085 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:07:16.454886 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CPU controller found on host. {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:07:16.459201 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:07:16.460204 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='0c839612eb3f2469420f2ccae990827f',container_format='bare',created_at=2023-08-30T14:01:42Z,direct_url=,disk_format='qcow2',id=bd27109d-8758-41af-af44-0e764a2addca,min_disk=0,min_ram=0,name='cirros-0.6.1-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21233664,status='active',tags=,updated_at=2023-08-30T14:01:49Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:07:16.460934 np0035104604 nova-compute[107505]: DEBUG 
nova.virt.hardware [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:07:16.461511 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:07:16.462227 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:07:16.462807 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:07:16.463397 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:07:16.464061 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:07:16.464658 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:07:16.465269 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:07:16.466375 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:07:16.466982 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:07:16.467444 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lazy-loading 'vcpu_model' on Instance uuid bfa80c86-477c-4e04-96e3-d7fee174f726 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 
14:07:16.508158 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:16.646709 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-3dee487d-685c-4db9-8fc2-916fab7db722 req-77452868-2b00-48e9-bdbc-dd4faed9f1bf service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Received event network-vif-plugged-246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:07:16.647256 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-3dee487d-685c-4db9-8fc2-916fab7db722 req-77452868-2b00-48e9-bdbc-dd4faed9f1bf service nova] Acquiring lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:16.647821 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-3dee487d-685c-4db9-8fc2-916fab7db722 req-77452868-2b00-48e9-bdbc-dd4faed9f1bf service nova] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:16.648800 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-3dee487d-685c-4db9-8fc2-916fab7db722 req-77452868-2b00-48e9-bdbc-dd4faed9f1bf service nova] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:16.649186 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-3dee487d-685c-4db9-8fc2-916fab7db722 req-77452868-2b00-48e9-bdbc-dd4faed9f1bf service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] No waiting events found dispatching network-vif-plugged-246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:07:16.649595 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-3dee487d-685c-4db9-8fc2-916fab7db722 req-77452868-2b00-48e9-bdbc-dd4faed9f1bf service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Received unexpected event network-vif-plugged-246226ea-1f45-4c14-95cc-92ee432293be for instance with vm_state active and task_state deleting. Aug 30 14:07:17.279334 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:07:17.297895 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Took 1.22 seconds to deallocate network for instance. 
Aug 30 14:07:17.324959 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.817s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:17.360450 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:17.368077 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:17.390203 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:17.390952 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:17.877705 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:17.904942 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:18.099643 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.731s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] End _get_guest_xml xml= Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: bfa80c86-477c-4e04-96e3-d7fee174f726 Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: instance-00000006 Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: 131072 Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: 1 Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 
nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: tempest-ServersAdmin275Test-server-1173805866 Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: 2023-08-30 14:07:16 Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: 128 Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: 1 Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: 0 Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: 0 Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: 1 Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: tempest-ServersAdmin275Test-1203166925-project-member Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: tempest-ServersAdmin275Test-1203166925 Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: OpenStack Foundation Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: OpenStack Nova Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: 27.1.0 Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: bfa80c86-477c-4e04-96e3-d7fee174f726 Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: bfa80c86-477c-4e04-96e3-d7fee174f726 Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Virtual Machine Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 
nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: hvm Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Nehalem Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 
nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: /dev/urandom Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: Aug 30 14:07:18.104172 np0035104604 nova-compute[107505]: {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:07:18.147823 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:07:18.148117 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:07:18.149385 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Using config drive Aug 30 14:07:18.190971 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:18.224954 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lazy-loading 'ec2_ids' on Instance uuid bfa80c86-477c-4e04-96e3-d7fee174f726 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:07:18.282546 np0035104604 nova-compute[107505]: DEBUG nova.objects.base [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Object Instance lazy-loaded attributes: vcpu_model,ec2_ids {{(pid=107505) wrapper /opt/stack/nova/nova/objects/base.py:126}} Aug 30 14:07:18.283160 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lazy-loading 'keypairs' on Instance uuid bfa80c86-477c-4e04-96e3-d7fee174f726 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:07:18.337048 np0035104604 nova-compute[107505]: DEBUG nova.objects.base [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 
tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Object Instance lazy-loaded attributes: vcpu_model,ec2_ids,keypairs {{(pid=107505) wrapper /opt/stack/nova/nova/objects/base.py:126}} Aug 30 14:07:18.543279 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Creating config drive at /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config Aug 30 14:07:18.550826 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpkn5ajrkb {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:18.589872 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpkn5ajrkb" returned: 0 in 0.038s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:18.643905 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 
tempest-ServersAdmin275Test-1203166925-project-member] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:18.651004 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config bfa80c86-477c-4e04-96e3-d7fee174f726_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:18.679251 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:18.693009 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.788s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:18.699394 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:07:18.716407 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None 
req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:07:18.752722 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-d1844954-3371-49da-9730-7109cf505840 req-27348d55-26e9-417c-918a-b72c1e6565ef service nova] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Received event network-vif-deleted-246226ea-1f45-4c14-95cc-92ee432293be {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:07:18.763786 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.373s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:18.917318 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config bfa80c86-477c-4e04-96e3-d7fee174f726_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" 
returned: 0 in 0.267s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:18.917798 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Deleting local config drive /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config because it was imported into RBD. Aug 30 14:07:18.939212 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:18.987284 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Deleted allocations for instance a9a74ed7-6eb9-4167-86a3-fe188c08af97 Aug 30 14:07:19.151453 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-80d9d554-cf3d-44c5-aba5-1e1e8d1c9a38 tempest-VolumesAssistedSnapshotsTest-74693406 tempest-VolumesAssistedSnapshotsTest-74693406-project-member] Lock "a9a74ed7-6eb9-4167-86a3-fe188c08af97" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.179s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:19.731546 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:19.881863 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Removed pending event for 
bfa80c86-477c-4e04-96e3-d7fee174f726 due to event {{(pid=107505) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:438}} Aug 30 14:07:19.881863 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:19.881863 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] VM Resumed (Lifecycle Event) Aug 30 14:07:19.881863 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Instance event wait completed in 0 seconds for {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:07:19.881863 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:07:19.884512 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Instance spawned successfully. 
Aug 30 14:07:19.885043 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:07:19.902555 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:19.913862 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:07:19.926929 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:19.927372 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found 
default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:19.928035 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:19.928765 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:19.928922 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:19.929479 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:19.937377 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 
bfa80c86-477c-4e04-96e3-d7fee174f726] During sync_power_state the instance has a pending task (rebuild_spawning). Skip. Aug 30 14:07:19.937885 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:19.938098 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] VM Started (Lifecycle Event) Aug 30 14:07:19.964889 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:19.979690 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:07:20.015640 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] During sync_power_state the instance has a pending task (rebuild_spawning). Skip. 
Aug 30 14:07:20.036915 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:20.142633 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:20.143462 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:20.143606 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Trying to apply a migration context that does not seem to be set for this instance {{(pid=107505) apply_migration_context /opt/stack/nova/nova/objects/instance.py:1078}} Aug 30 14:07:20.241794 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-b62e67bf-a6b8-4a3e-83bb-38d39ef66730 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.099s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:20.336585 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:22.722984 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Rebuilding instance Aug 30 14:07:23.229708 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:23.343854 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:23.344787 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:23.375569 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Starting instance... {{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:07:23.593758 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Instance destroyed successfully. Aug 30 14:07:23.602581 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:23.602968 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:23.609786 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Instance destroyed successfully. 
Aug 30 14:07:23.646330 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:07:23.646787 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Claim successful on node np0035104604 Aug 30 14:07:23.653617 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:24.077364 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Deleting instance files /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726_del Aug 30 14:07:24.078530 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Deletion of /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726_del complete Aug 30 14:07:24.249884 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:24.344295 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:07:24.344971 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Creating image(s) Aug 30 14:07:24.374361 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:24.405044 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:24.433525 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:24.440839 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:24.546559 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.105s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:24.547984 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:24.549930 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.002s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:24.550499 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:24.604812 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:24.610072 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e bfa80c86-477c-4e04-96e3-d7fee174f726_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:24.733956 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:24.906736 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] CMD "rbd import --pool vms 
/opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e bfa80c86-477c-4e04-96e3-d7fee174f726_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.297s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:25.006058 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] resizing rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:07:25.108829 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.859s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:25.116345 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:07:25.116770 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Ensure instance console log exists: /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 
14:07:25.117086 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:25.117476 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:25.117843 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:25.119491 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:07:25.123367 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:07:25.128285 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError Aug 30 14:07:25.131767 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Searching host: 'np0035104604' for CPU controller through CGroups V1... 
{{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:07:25.132845 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:07:25.135155 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:07:25.135456 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:07:25.137431 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:07:25.138357 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:07:25.138786 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:07:25.139191 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None 
req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:07:25.139623 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:07:25.139961 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:07:25.140353 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:07:25.140874 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:07:25.141344 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Build topologies for 1 vcpu(s) 
1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:07:25.141778 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:07:25.142069 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:07:25.142342 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:07:25.143002 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Lazy-loading 'vcpu_model' on Instance uuid bfa80c86-477c-4e04-96e3-d7fee174f726 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:07:25.158777 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:07:25.203099 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:25.230129 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.626s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:25.232775 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:07:25.318559 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:07:25.319298 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:07:25.441548 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:07:25.462799 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Start building block device mappings for instance. 
{{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:07:25.572215 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5a6677b100d348b69575a22d363e18f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b756ee34b70d4189a05bf0a0cb751e60', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:07:25.747878 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:07:25.749472 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:07:25.750101 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Creating image(s) Aug 30 14:07:25.780976 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:25.832443 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:25.894304 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:25.899091 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:25.934032 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "8f0941f9-96cc-445a-9d3b-83aa415ff950" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:25.934442 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "8f0941f9-96cc-445a-9d3b-83aa415ff950" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:25.958068 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Starting instance... 
{{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:07:25.983282 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.775s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:26.023038 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:26.027252 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:26.057680 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.158s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:26.058894 np0035104604 
nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:26.059805 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:26.060298 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:26.093360 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:26.098849 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd 
(subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:26.350731 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Successfully created port: 614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:07:26.478791 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:26.479192 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:26.483004 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 
3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.385s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:26.521371 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:07:26.521985 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Claim successful on node np0035104604 Aug 30 14:07:26.593753 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] resizing rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:07:26.876280 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:07:26.876782 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Ensure instance console log exists: 
/opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:07:26.877760 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:26.877939 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:26.878391 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:27.026024 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.998s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:27.028808 np0035104604 
nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] End _get_guest_xml xml= [guest domain XML garbled in capture: element tags lost and each XML line re-prefixed with the entry timestamp. Recoverable values: uuid bfa80c86-477c-4e04-96e3-d7fee174f726, name instance-00000006, memory 131072, vcpus 1, display name tempest-ServersAdmin275Test-server-1173805866, creation time 2023-08-30 14:07:25, flavor values 128/1/0/0/1, owner tempest-ServersAdmin275Test-1203166925-project-member / project tempest-ServersAdmin275Test-1203166925, sysinfo OpenStack Foundation / OpenStack Nova / 27.1.0, os type hvm, cpu model Nehalem, rng backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:07:27.225438 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Successfully updated port: 614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505)
_update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:07:27.230039 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:27.245364 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "refresh_cache-3e4eb957-d13b-4d8c-be3b-bad1d61114fe" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:07:27.245576 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquired lock "refresh_cache-3e4eb957-d13b-4d8c-be3b-bad1d61114fe" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:07:27.246047 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:07:27.249108 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-1cb61925-b00d-477d-b743-5abbec788ae3 req-664478e9-7e59-47f2-9c04-73a9821a5cad service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received event network-changed-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:07:27.249489 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-1cb61925-b00d-477d-b743-5abbec788ae3 
req-664478e9-7e59-47f2-9c04-73a9821a5cad service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Refreshing instance network info cache due to event network-changed-614532f0-0086-4382-b9bd-08c309fc6548. {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:07:27.249920 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1cb61925-b00d-477d-b743-5abbec788ae3 req-664478e9-7e59-47f2-9c04-73a9821a5cad service nova] Acquiring lock "refresh_cache-3e4eb957-d13b-4d8c-be3b-bad1d61114fe" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:07:27.575066 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:27.608566 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:07:27.609048 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] No BDM found with device name hda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:07:27.611329 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Using config drive Aug 30 14:07:27.645344 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:27.654876 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Instance cache missing network info. 
{{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:07:27.780048 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Lazy-loading 'ec2_ids' on Instance uuid bfa80c86-477c-4e04-96e3-d7fee174f726 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:07:27.798118 np0035104604 nova-compute[107505]: DEBUG nova.objects.base [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Object Instance lazy-loaded attributes: vcpu_model,ec2_ids {{(pid=107505) wrapper /opt/stack/nova/nova/objects/base.py:126}} Aug 30 14:07:27.798893 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Lazy-loading 'keypairs' on Instance uuid bfa80c86-477c-4e04-96e3-d7fee174f726 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:07:27.990490 np0035104604 nova-compute[107505]: DEBUG nova.objects.base [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Object Instance lazy-loaded attributes: vcpu_model,ec2_ids,keypairs {{(pid=107505) wrapper /opt/stack/nova/nova/objects/base.py:126}} Aug 30 14:07:28.365160 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Creating config drive at /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config Aug 30 14:07:28.370685 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.processutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmp64edipmc {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:28.409998 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] CMD "genisoimage -o /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmp64edipmc" returned: 0 in 0.038s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:28.574382 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] rbd image bfa80c86-477c-4e04-96e3-d7fee174f726_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:28.578458 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config bfa80c86-477c-4e04-96e3-d7fee174f726_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:28.607474 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 1.032s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:28.613649 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:07:28.634229 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:07:28.652048 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:28.673025 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.194s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:28.674631 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:07:28.862532 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] CMD "rbd import --pool vms /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config bfa80c86-477c-4e04-96e3-d7fee174f726_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.284s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:28.863577 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:07:28.863963 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:07:28.873856 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Deleting local config drive /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726/disk.config because it was imported into RBD. Aug 30 14:07:29.012029 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Updating instance_info_cache with network_info: [{"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", 
"ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:07:29.014545 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:07:29.030052 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Releasing lock "refresh_cache-3e4eb957-d13b-4d8c-be3b-bad1d61114fe" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:07:29.030660 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Instance network_info: |[{"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:07:29.031320 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1cb61925-b00d-477d-b743-5abbec788ae3 req-664478e9-7e59-47f2-9c04-73a9821a5cad service nova] Acquired lock "refresh_cache-3e4eb957-d13b-4d8c-be3b-bad1d61114fe" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:07:29.031785 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-1cb61925-b00d-477d-b743-5abbec788ae3 req-664478e9-7e59-47f2-9c04-73a9821a5cad service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Refreshing network info cache for port 614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:07:29.038262 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Start _get_guest_xml network_info=[{"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:07:29.043937 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 
8f0941f9-96cc-445a-9d3b-83aa415ff950] Start building block device mappings for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:07:29.201228 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5a6677b100d348b69575a22d363e18f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b756ee34b70d4189a05bf0a0cb751e60', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:07:29.354971 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:07:29.365757 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:07:29.366518 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:07:29.366942 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Creating image(s) Aug 30 14:07:29.412337 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 8f0941f9-96cc-445a-9d3b-83aa415ff950_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:29.447683 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 8f0941f9-96cc-445a-9d3b-83aa415ff950_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:29.486580 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 8f0941f9-96cc-445a-9d3b-83aa415ff950_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:29.490425 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:29.657062 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.166s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:29.658023 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:29.659111 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:29.659668 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:29.692578 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 8f0941f9-96cc-445a-9d3b-83aa415ff950_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:29.697569 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 8f0941f9-96cc-445a-9d3b-83aa415ff950_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:29.719783 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... 
{{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:07:29.720555 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:07:29.724165 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:29.724440 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] VM Stopped (Lifecycle Event) Aug 30 14:07:29.730945 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:07:29.731500 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:07:29.734223 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:07:29.735213 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:07:29.735719 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:07:29.736327 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware 
[None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:07:29.736652 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:07:29.737152 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:07:29.737831 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:07:29.739928 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:07:29.739928 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Build topologies for 1 
vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:07:29.739928 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:07:29.739928 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:07:29.740447 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:07:29.773784 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:29.803147 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:29.804924 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None 
None] Removed pending event for bfa80c86-477c-4e04-96e3-d7fee174f726 due to event {{(pid=107505) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:438}} Aug 30 14:07:29.805398 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:29.805971 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] VM Resumed (Lifecycle Event) Aug 30 14:07:29.809800 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Instance event wait completed in 0 seconds for {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:07:29.810375 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:07:29.812092 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-1a490111-17cd-4f0c-907e-d9e0fdc0e5ae None None] [instance: a9a74ed7-6eb9-4167-86a3-fe188c08af97] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:29.963410 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:29.981794 
np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Instance spawned successfully. Aug 30 14:07:29.981794 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:07:29.984192 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:07:30.008069 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] During sync_power_state the instance has a pending task (rebuild_spawning). Skip. 
Aug 30 14:07:30.008830 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:30.009090 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] VM Started (Lifecycle Event) Aug 30 14:07:30.154695 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:30.155596 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:30.156790 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:30.157402 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found 
default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:30.158330 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:30.159666 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:30.182245 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Successfully created port: 3195884f-aa65-4b41-af9d-9f8d326dbc95 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:07:30.191850 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:30.379292 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "rbd import --pool vms 
/opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 8f0941f9-96cc-445a-9d3b-83aa415ff950_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.682s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:30.434789 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:30.639446 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.869s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:30.672248 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:30.678007 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:30.702356 np0035104604 nova-compute[107505]: DEBUG 
nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:07:30.720654 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] resizing rbd image 8f0941f9-96cc-445a-9d3b-83aa415ff950_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:07:30.767737 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] During sync_power_state the instance has a pending task (rebuild_spawning). Skip. 
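The two records above show the power-state sync rule in action: the DB and hypervisor both report power_state 1 (running), but because the instance has a pending task (`rebuild_spawning`) the sync is skipped rather than risk racing the in-flight operation. A minimal sketch of that decision logic follows; the names (`sync_power_state`, `RUNNING`, the returned action strings) are illustrative, not Nova's actual implementation:

```python
# Hypothetical sketch of the sync_power_state decision seen in the log.
# Power-state codes follow Nova's convention: 1 = RUNNING, 4 = SHUTDOWN.
RUNNING = 1
SHUTDOWN = 4

def sync_power_state(db_power_state, vm_power_state, task_state):
    """Return the action a compute manager would take for one instance."""
    if task_state is not None:
        # A pending task (e.g. 'rebuild_spawning') owns the instance;
        # reconciling now could race with it, so do nothing.
        return "skip"
    if db_power_state == vm_power_state:
        return "in_sync"
    # The hypervisor is the source of truth: record its state in the DB.
    return f"update_db_to_{vm_power_state}"

# The case logged above: both RUNNING, task pending -> skip.
print(sync_power_state(RUNNING, RUNNING, "rebuild_spawning"))  # skip
print(sync_power_state(SHUTDOWN, RUNNING, None))               # update_db_to_1
```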
Aug 30 14:07:30.882462 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:30.883012 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:30.883434 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 tempest-ServersAdmin275Test-1859188348-project-admin] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Trying to apply a migration context that does not seem to be set for this instance {{(pid=107505) apply_migration_context /opt/stack/nova/nova/objects/instance.py:1078}} Aug 30 14:07:30.899478 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:07:30.899983 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 
8f0941f9-96cc-445a-9d3b-83aa415ff950] Ensure instance console log exists: /opt/stack/data/nova/instances/8f0941f9-96cc-445a-9d3b-83aa415ff950/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:07:30.900709 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:30.901222 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:30.901719 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:31.109594 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:31.133140 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed3e83ea-7968-48db-97f5-249ddd271591 tempest-ServersAdmin275Test-1859188348 
tempest-ServersAdmin275Test-1859188348-project-admin] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.250s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:31.386722 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Successfully updated port: 3195884f-aa65-4b41-af9d-9f8d326dbc95 {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:07:31.402511 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ed3b538e-61ec-4132-877d-c54d57465ef7 req-d32fffee-4336-42d2-92a3-3ff2b79fc445 service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Received event network-changed-3195884f-aa65-4b41-af9d-9f8d326dbc95 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:07:31.402941 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ed3b538e-61ec-4132-877d-c54d57465ef7 req-d32fffee-4336-42d2-92a3-3ff2b79fc445 service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Refreshing instance network info cache due to event network-changed-3195884f-aa65-4b41-af9d-9f8d326dbc95. 
{{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:07:31.403551 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ed3b538e-61ec-4132-877d-c54d57465ef7 req-d32fffee-4336-42d2-92a3-3ff2b79fc445 service nova] Acquiring lock "refresh_cache-8f0941f9-96cc-445a-9d3b-83aa415ff950" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:07:31.403961 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ed3b538e-61ec-4132-877d-c54d57465ef7 req-d32fffee-4336-42d2-92a3-3ff2b79fc445 service nova] Acquired lock "refresh_cache-8f0941f9-96cc-445a-9d3b-83aa415ff950" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:07:31.404461 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-ed3b538e-61ec-4132-877d-c54d57465ef7 req-d32fffee-4336-42d2-92a3-3ff2b79fc445 service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Refreshing network info cache for port 3195884f-aa65-4b41-af9d-9f8d326dbc95 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:07:31.416498 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "refresh_cache-8f0941f9-96cc-445a-9d3b-83aa415ff950" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:07:31.557169 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-1cb61925-b00d-477d-b743-5abbec788ae3 req-664478e9-7e59-47f2-9c04-73a9821a5cad service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Updated VIF entry in instance network info cache for port 614532f0-0086-4382-b9bd-08c309fc6548. 
{{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:07:31.557841 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-1cb61925-b00d-477d-b743-5abbec788ae3 req-664478e9-7e59-47f2-9c04-73a9821a5cad service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Updating instance_info_cache with network_info: [{"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:07:31.574283 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.897s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:31.576810 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None 
req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1551096224',display_name='tempest-ServersAdminTestJSON-server-1551096224',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadmintestjson-server-1551096224',id=7,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b756ee34b70d4189a05bf0a0cb751e60',ramdisk_id='',reservation_id='r-mntu6p6e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersAdminTe
stJSON-499476127',owner_user_name='tempest-ServersAdminTestJSON-499476127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:07:26Z,user_data=None,user_id='5a6677b100d348b69575a22d363e18f4',uuid=3e4eb957-d13b-4d8c-be3b-bad1d61114fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:07:31.577315 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converting VIF {"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", 
"type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:07:31.579019 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:07:31.579915 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lazy-loading 'pci_devices' on Instance uuid 3e4eb957-d13b-4d8c-be3b-bad1d61114fe {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:07:31.582093 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1cb61925-b00d-477d-b743-5abbec788ae3 req-664478e9-7e59-47f2-9c04-73a9821a5cad service nova] Releasing lock 
"refresh_cache-3e4eb957-d13b-4d8c-be3b-bad1d61114fe" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:07:31.598741 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] End _get_guest_xml xml= [guest domain XML omitted: element tags were stripped during log capture, leaving only bare text nodes. Recoverable values: uuid 3e4eb957-d13b-4d8c-be3b-bad1d61114fe, name instance-00000007, title tempest-ServersAdminTestJSON-server-1551096224, creationTime 2023-08-30 14:07:29, memory 131072 KiB, 1 vCPU (flavor: 128 MB RAM, 0 ephemeral, 0 swap, 1 root GB), owner tempest-ServersAdminTestJSON-499476127-project-member / tempest-ServersAdminTestJSON-499476127, sysinfo OpenStack Foundation / OpenStack Nova / 27.1.0 / "Virtual Machine", os type hvm, CPU model Nehalem, rng backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:07:31.609780 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Preparing to wait for external event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:07:31.609780 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:31.609780 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:31.609780 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:31.609780 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1551096224',display_name='tempest-ServersAdminTestJSON-server-1551096224',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadmintestjson-server-1551096224',id=7,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b756ee34b70d4189a05bf0a0cb751e60',ramdisk_id='',reservation_id='r-mntu6p6e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=Fals
e,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-499476127',owner_user_name='tempest-ServersAdminTestJSON-499476127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:07:26Z,user_data=None,user_id='5a6677b100d348b69575a22d363e18f4',uuid=3e4eb957-d13b-4d8c-be3b-bad1d61114fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:07:31.612386 
np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converting VIF {"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:07:31.612386 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:07:31.612386 np0035104604 nova-compute[107505]: DEBUG 
os_vif [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:07:31.612386 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:31.612386 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:07:31.612386 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:07:31.619410 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:31.619885 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap614532f0-00, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:07:31.620785 np0035104604 nova-compute[107505]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap614532f0-00, col_values=(('external_ids', {'iface-id': '614532f0-0086-4382-b9bd-08c309fc6548', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:63:35', 'vm-uuid': '3e4eb957-d13b-4d8c-be3b-bad1d61114fe'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:07:31.623313 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:31.629563 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:31.632028 np0035104604 nova-compute[107505]: INFO os_vif [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') Aug 30 14:07:31.638338 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-ed3b538e-61ec-4132-877d-c54d57465ef7 req-d32fffee-4336-42d2-92a3-3ff2b79fc445 service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Instance cache missing network info. 
{{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:07:31.820166 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Acquiring lock "bfa80c86-477c-4e04-96e3-d7fee174f726" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:31.820724 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "bfa80c86-477c-4e04-96e3-d7fee174f726" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:31.821107 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Acquiring lock "bfa80c86-477c-4e04-96e3-d7fee174f726-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:31.821537 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "bfa80c86-477c-4e04-96e3-d7fee174f726-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:31.821976 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "bfa80c86-477c-4e04-96e3-d7fee174f726-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:31.826278 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Terminating instance Aug 30 14:07:31.829422 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Acquiring lock "refresh_cache-bfa80c86-477c-4e04-96e3-d7fee174f726" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:07:31.829823 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Acquired lock "refresh_cache-bfa80c86-477c-4e04-96e3-d7fee174f726" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:07:31.830227 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Building network info cache for 
instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:07:31.837644 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:07:31.838334 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] No BDM found with device name hda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:07:31.838697 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] No VIF found with MAC fa:16:3e:e5:63:35, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:07:31.839482 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Using config drive Aug 30 14:07:31.869475 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:32.292295 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None 
req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:07:32.418568 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-ed3b538e-61ec-4132-877d-c54d57465ef7 req-d32fffee-4336-42d2-92a3-3ff2b79fc445 service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:07:32.435739 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ed3b538e-61ec-4132-877d-c54d57465ef7 req-d32fffee-4336-42d2-92a3-3ff2b79fc445 service nova] Releasing lock "refresh_cache-8f0941f9-96cc-445a-9d3b-83aa415ff950" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:07:32.437080 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquired lock "refresh_cache-8f0941f9-96cc-445a-9d3b-83aa415ff950" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:07:32.437615 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:07:32.856955 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 
tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:07:32.973500 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:07:32.990771 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Creating config drive at /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config Aug 30 14:07:32.995862 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpzseb88q4 {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:33.026178 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Releasing lock "refresh_cache-bfa80c86-477c-4e04-96e3-d7fee174f726" {{(pid=107505) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:07:33.027104 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Start destroying the instance on the hypervisor. {{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:07:33.029904 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpzseb88q4" returned: 0 in 0.033s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:33.061271 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:33.064633 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:33.240225 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.175s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:33.240683 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Deleting local config drive /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config because it was imported into RBD. Aug 30 14:07:33.271221 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:33.294112 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:33.502399 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Instance destroyed successfully. 
Aug 30 14:07:33.503155 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lazy-loading 'resources' on Instance uuid bfa80c86-477c-4e04-96e3-d7fee174f726 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:07:33.541869 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-a674268b-bde6-4ada-90b5-c5fd14f60eda req-9fc65fb3-4e27-4c81-b409-4db7812d1ef8 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:07:33.542606 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-a674268b-bde6-4ada-90b5-c5fd14f60eda req-9fc65fb3-4e27-4c81-b409-4db7812d1ef8 service nova] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:33.545775 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-a674268b-bde6-4ada-90b5-c5fd14f60eda req-9fc65fb3-4e27-4c81-b409-4db7812d1ef8 service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:33.545775 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-a674268b-bde6-4ada-90b5-c5fd14f60eda req-9fc65fb3-4e27-4c81-b409-4db7812d1ef8 service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=107505) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:33.545775 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-a674268b-bde6-4ada-90b5-c5fd14f60eda req-9fc65fb3-4e27-4c81-b409-4db7812d1ef8 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Processing event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:07:33.724796 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:33.768187 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:33.781290 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:33.810697 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:34.103162 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Updating instance_info_cache with network_info: [{"id": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "address": "fa:16:3e:41:92:45", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3195884f-aa", "ovs_interfaceid": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:07:34.126920 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Releasing lock "refresh_cache-8f0941f9-96cc-445a-9d3b-83aa415ff950" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:07:34.127327 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Instance network_info: |[{"id": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "address": "fa:16:3e:41:92:45", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": 
false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3195884f-aa", "ovs_interfaceid": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:07:34.131822 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Start _get_guest_xml network_info=[{"id": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "address": "fa:16:3e:41:92:45", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3195884f-aa", "ovs_interfaceid": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 
'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:07:34.138176 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:07:34.141082 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... 
{{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:07:34.141628 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:07:34.143592 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:07:34.144102 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:07:34.147019 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:07:34.148526 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:07:34.149448 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:07:34.149901 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware 
[None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:07:34.150609 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:07:34.151003 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:07:34.151399 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:07:34.152257 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:07:34.152728 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Build topologies for 1 
vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:07:34.153230 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:07:34.153968 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:07:34.154315 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:07:34.169638 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:34.197371 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Deleting instance files 
/opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726_del Aug 30 14:07:34.198310 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Deletion of /opt/stack/data/nova/instances/bfa80c86-477c-4e04-96e3-d7fee174f726_del complete Aug 30 14:07:34.233669 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:07:34.234482 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:34.234745 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] VM Started (Lifecycle Event) Aug 30 14:07:34.270538 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:07:34.276876 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 
14:07:34.282714 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Instance spawned successfully. Aug 30 14:07:34.283223 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:07:34.286465 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:07:34.295206 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Took 1.27 seconds to destroy the instance on the hypervisor. Aug 30 14:07:34.295881 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:07:34.296303 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:07:34.296456 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:07:34.445192 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:07:34.446077 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:34.446411 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] VM Paused (Lifecycle Event) Aug 30 14:07:34.460675 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:34.461185 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 
3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:34.462223 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:34.462991 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:34.463861 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:34.464686 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:34.486566 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None 
req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:34.493989 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:34.494442 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] VM Resumed (Lifecycle Event) Aug 30 14:07:34.498831 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:07:34.517303 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:07:34.527108 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:34.533233 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:07:34.545593 np0035104604 nova-compute[107505]: INFO 
nova.compute.manager [-] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Took 0.25 seconds to deallocate network for instance. Aug 30 14:07:34.572300 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:07:34.626674 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Took 8.88 seconds to spawn the instance on the hypervisor. Aug 30 14:07:34.627496 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:34.663835 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:34.665460 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 
14:07:34.810992 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Took 11.37 seconds to build instance. Aug 30 14:07:34.839817 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-eb67206d-02a1-47ca-8437-271f1bba9e7b tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.495s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:34.965573 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.796s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:34.988565 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 8f0941f9-96cc-445a-9d3b-83aa415ff950_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:34.992051 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:35.313610 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:35.600832 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-984d3040-45c7-4541-bda3-655cc8a1acd7 req-a7edfe93-d3c3-4dc4-94fa-c8482a6ec97f service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:07:35.601492 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-984d3040-45c7-4541-bda3-655cc8a1acd7 req-a7edfe93-d3c3-4dc4-94fa-c8482a6ec97f service nova] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:35.602418 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-984d3040-45c7-4541-bda3-655cc8a1acd7 req-a7edfe93-d3c3-4dc4-94fa-c8482a6ec97f service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:35.604968 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-984d3040-45c7-4541-bda3-655cc8a1acd7 req-a7edfe93-d3c3-4dc4-94fa-c8482a6ec97f service nova] Lock 
"3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.003s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:35.605605 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-984d3040-45c7-4541-bda3-655cc8a1acd7 req-a7edfe93-d3c3-4dc4-94fa-c8482a6ec97f service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] No waiting events found dispatching network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:07:35.606482 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-984d3040-45c7-4541-bda3-655cc8a1acd7 req-a7edfe93-d3c3-4dc4-94fa-c8482a6ec97f service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received unexpected event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 for instance with vm_state active and task_state None. 
Aug 30 14:07:35.698477 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.706s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:35.702055 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:07:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-566464463',display_name='tempest-ServersAdminTestJSON-server-566464463',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadmintestjson-server-566464463',id=8,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b756ee34b70d4189a05bf0a0cb751e60',ramdisk_id='',reservation_id='r-g8rakioj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadat
a={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-499476127',owner_user_name='tempest-ServersAdminTestJSON-499476127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:07:29Z,user_data=None,user_id='5a6677b100d348b69575a22d363e18f4',uuid=8f0941f9-96cc-445a-9d3b-83aa415ff950,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "address": "fa:16:3e:41:92:45", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3195884f-aa", "ovs_interfaceid": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 
14:07:35.703035 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converting VIF {"id": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "address": "fa:16:3e:41:92:45", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3195884f-aa", "ovs_interfaceid": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:07:35.704691 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:92:45,bridge_name='br-int',has_traffic_filtering=True,id=3195884f-aa65-4b41-af9d-9f8d326dbc95,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3195884f-aa') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:07:35.706875 np0035104604 
nova-compute[107505]: DEBUG nova.objects.instance [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lazy-loading 'pci_devices' on Instance uuid 8f0941f9-96cc-445a-9d3b-83aa415ff950 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:07:35.721996 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] End _get_guest_xml xml= [guest domain XML elided: the angle-bracket markup was stripped during log capture, leaving only element values; recoverable fields include uuid 8f0941f9-96cc-445a-9d3b-83aa415ff950, name instance-00000008, memory 131072, vcpus 1, nova display name tempest-ServersAdminTestJSON-server-566464463, creation time 2023-08-30 14:07:34, flavor memory 128 / vcpus 1 / disk 1 / ephemeral 0 / swap 0, owner tempest-ServersAdminTestJSON-499476127-project-member / tempest-ServersAdminTestJSON-499476127, sysinfo manufacturer "OpenStack Foundation" product "OpenStack Nova" version 27.1.0, serial/uuid 8f0941f9-96cc-445a-9d3b-83aa415ff950, family "Virtual Machine", os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:07:35.727553 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Preparing to wait for external event network-vif-plugged-3195884f-aa65-4b41-af9d-9f8d326dbc95 {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:07:35.727980 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:35.732781 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.001s
{{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:35.734105 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.005s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:35.735358 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:07:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-566464463',display_name='tempest-ServersAdminTestJSON-server-566464463',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadmintestjson-server-566464463',id=8,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b756ee34b70d4189a05bf0a0cb75
1e60',ramdisk_id='',reservation_id='r-g8rakioj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-499476127',owner_user_name='tempest-ServersAdminTestJSON-499476127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:07:29Z,user_data=None,user_id='5a6677b100d348b69575a22d363e18f4',uuid=8f0941f9-96cc-445a-9d3b-83aa415ff950,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "address": "fa:16:3e:41:92:45", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3195884f-aa", "ovs_interfaceid": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:07:35.736025 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converting VIF {"id": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "address": "fa:16:3e:41:92:45", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3195884f-aa", "ovs_interfaceid": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:07:35.737308 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:41:92:45,bridge_name='br-int',has_traffic_filtering=True,id=3195884f-aa65-4b41-af9d-9f8d326dbc95,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3195884f-aa') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:07:35.738183 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:92:45,bridge_name='br-int',has_traffic_filtering=True,id=3195884f-aa65-4b41-af9d-9f8d326dbc95,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3195884f-aa') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:07:35.739009 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:35.739907 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:07:35.740526 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:07:35.745427 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:35.747346 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3195884f-aa, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:07:35.748523 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3195884f-aa, col_values=(('external_ids', {'iface-id': '3195884f-aa65-4b41-af9d-9f8d326dbc95', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:41:92:45', 'vm-uuid': '8f0941f9-96cc-445a-9d3b-83aa415ff950'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:07:35.763795 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:07:35.765555 np0035104604 nova-compute[107505]: INFO os_vif [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:92:45,bridge_name='br-int',has_traffic_filtering=True,id=3195884f-aa65-4b41-af9d-9f8d326dbc95,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3195884f-aa') Aug 30 14:07:35.818558 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] No BDM found with device name vda, not building 
metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:07:35.819216 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] No BDM found with device name hda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:07:35.819600 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] No VIF found with MAC fa:16:3e:41:92:45, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:07:35.820513 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Using config drive Aug 30 14:07:35.849892 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 8f0941f9-96cc-445a-9d3b-83aa415ff950_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:36.201357 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.888s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:36.211420 
np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:07:36.230177 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:07:36.251640 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Creating config drive at /opt/stack/data/nova/instances/8f0941f9-96cc-445a-9d3b-83aa415ff950/disk.config Aug 30 14:07:36.256181 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/8f0941f9-96cc-445a-9d3b-83aa415ff950/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 
/tmp/tmpl8tdfo2v {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:36.289804 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.624s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:36.295240 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/8f0941f9-96cc-445a-9d3b-83aa415ff950/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpl8tdfo2v" returned: 0 in 0.039s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:36.475867 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 8f0941f9-96cc-445a-9d3b-83aa415ff950_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:36.479534 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/8f0941f9-96cc-445a-9d3b-83aa415ff950/disk.config 8f0941f9-96cc-445a-9d3b-83aa415ff950_disk.config --image-format=2 --id cinder --conf 
/etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:36.539267 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Deleted allocations for instance bfa80c86-477c-4e04-96e3-d7fee174f726 Aug 30 14:07:36.643523 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-7dd21448-cf66-42e1-9286-0f6fdeb5dab2 tempest-ServersAdmin275Test-1203166925 tempest-ServersAdmin275Test-1203166925-project-member] Lock "bfa80c86-477c-4e04-96e3-d7fee174f726" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 4.823s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:36.696657 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/8f0941f9-96cc-445a-9d3b-83aa415ff950/disk.config 8f0941f9-96cc-445a-9d3b-83aa415ff950_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.217s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:36.697659 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Deleting local config drive /opt/stack/data/nova/instances/8f0941f9-96cc-445a-9d3b-83aa415ff950/disk.config because it was imported into RBD. 
Aug 30 14:07:36.724681 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:36.735659 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:36.740600 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:36.908893 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:37.651769 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-62d4068a-a76e-4486-8096-fc63fe810ba8 req-a304f1cf-f3ed-4d53-90e5-a0a5e0c00a33 service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Received event network-vif-plugged-3195884f-aa65-4b41-af9d-9f8d326dbc95 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:07:37.652477 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-62d4068a-a76e-4486-8096-fc63fe810ba8 req-a304f1cf-f3ed-4d53-90e5-a0a5e0c00a33 service nova] Acquiring lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:37.652592 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-62d4068a-a76e-4486-8096-fc63fe810ba8 req-a304f1cf-f3ed-4d53-90e5-a0a5e0c00a33 service nova] Lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: 
waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:37.653492 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-62d4068a-a76e-4486-8096-fc63fe810ba8 req-a304f1cf-f3ed-4d53-90e5-a0a5e0c00a33 service nova] Lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:37.653985 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-62d4068a-a76e-4486-8096-fc63fe810ba8 req-a304f1cf-f3ed-4d53-90e5-a0a5e0c00a33 service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Processing event network-vif-plugged-3195884f-aa65-4b41-af9d-9f8d326dbc95 {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:07:37.654375 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-62d4068a-a76e-4486-8096-fc63fe810ba8 req-a304f1cf-f3ed-4d53-90e5-a0a5e0c00a33 service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Received event network-vif-plugged-3195884f-aa65-4b41-af9d-9f8d326dbc95 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:07:37.654754 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-62d4068a-a76e-4486-8096-fc63fe810ba8 req-a304f1cf-f3ed-4d53-90e5-a0a5e0c00a33 service nova] Acquiring lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:37.655529 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-62d4068a-a76e-4486-8096-fc63fe810ba8 req-a304f1cf-f3ed-4d53-90e5-a0a5e0c00a33 service nova] Lock 
"8f0941f9-96cc-445a-9d3b-83aa415ff950-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:37.655980 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-62d4068a-a76e-4486-8096-fc63fe810ba8 req-a304f1cf-f3ed-4d53-90e5-a0a5e0c00a33 service nova] Lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:37.656417 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-62d4068a-a76e-4486-8096-fc63fe810ba8 req-a304f1cf-f3ed-4d53-90e5-a0a5e0c00a33 service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] No waiting events found dispatching network-vif-plugged-3195884f-aa65-4b41-af9d-9f8d326dbc95 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:07:37.657024 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-62d4068a-a76e-4486-8096-fc63fe810ba8 req-a304f1cf-f3ed-4d53-90e5-a0a5e0c00a33 service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Received unexpected event network-vif-plugged-3195884f-aa65-4b41-af9d-9f8d326dbc95 for instance with vm_state building and task_state spawning. 
Aug 30 14:07:37.795260 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:07:37.797642 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:37.797985 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] VM Started (Lifecycle Event) Aug 30 14:07:37.807118 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:07:37.810300 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Instance spawned successfully. 
Aug 30 14:07:37.810735 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:07:37.821674 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:37.834111 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:07:37.841229 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:37.841486 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Found default 
for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:37.842309 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:37.843038 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:37.843824 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:37.844932 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:37.861173 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 
8f0941f9-96cc-445a-9d3b-83aa415ff950] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:07:37.861606 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:37.861973 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] VM Paused (Lifecycle Event) Aug 30 14:07:37.888412 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:37.902426 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:37.902877 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] VM Resumed (Lifecycle Event) Aug 30 14:07:37.931229 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Took 8.56 seconds to spawn the instance on the hypervisor. 
Aug 30 14:07:37.932058 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:37.940197 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:37.948888 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:07:37.968892 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:07:38.055915 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Took 12.01 seconds to build instance. 
Aug 30 14:07:38.076799 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-83f5fddb-49c5-4b36-a9d5-d2af17346419 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "8f0941f9-96cc-445a-9d3b-83aa415ff950" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.142s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:38.658156 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:38.711844 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:40.703300 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:40.751031 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:42.244264 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:42.244907 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a 
tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:42.282454 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Starting instance... {{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:07:42.560062 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:42.560694 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:42.567121 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:07:42.568329 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Claim successful on node np0035104604 Aug 30 14:07:43.403374 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:43.659920 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:44.190324 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.787s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:44.196424 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:07:44.217294 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None 
req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:07:44.250360 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.689s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:44.251226 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:07:44.332840 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:07:44.335695 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:07:44.491955 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:07:44.520178 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Start building block device mappings for instance. 
{{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:07:44.645500 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5a6677b100d348b69575a22d363e18f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b756ee34b70d4189a05bf0a0cb751e60', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:07:44.781890 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:07:44.782955 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:07:44.783392 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Creating image(s) Aug 30 14:07:44.814323 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:44.845858 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:44.872389 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:44.877054 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:44.981782 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.103s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:44.983619 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:44.985591 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.002s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:44.987029 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:45.031152 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:45.037436 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:45.427687 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.390s {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:45.484879 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Successfully created port: 3c99b14a-715d-489f-9459-82876d9ca624 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:07:45.637057 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:45.693506 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] resizing rbd image 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:07:45.752879 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:45.815860 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:07:45.816334 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Ensure instance console log 
exists: /opt/stack/data/nova/instances/4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:07:45.816844 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:45.817685 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:45.818789 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:47.254148 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Successfully updated port: 3c99b14a-715d-489f-9459-82876d9ca624 {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:07:47.298321 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "refresh_cache-4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:07:47.298612 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquired lock "refresh_cache-4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:07:47.298901 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:07:47.308273 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-0b1d9fa3-813e-4998-a4b9-10e428a15200 req-f598427d-490e-4525-a844-7da5c8027849 service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Received event network-changed-3c99b14a-715d-489f-9459-82876d9ca624 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:07:47.308737 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-0b1d9fa3-813e-4998-a4b9-10e428a15200 req-f598427d-490e-4525-a844-7da5c8027849 service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Refreshing instance network info cache due to event network-changed-3c99b14a-715d-489f-9459-82876d9ca624. 
{{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:07:47.309193 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0b1d9fa3-813e-4998-a4b9-10e428a15200 req-f598427d-490e-4525-a844-7da5c8027849 service nova] Acquiring lock "refresh_cache-4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:07:47.551538 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:47.599704 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:07:48.656498 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:48.656498 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] VM Stopped (Lifecycle Event) Aug 30 14:07:48.662067 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:48.675421 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-f153ba93-c159-413d-8748-06506a2c1b0d None None] [instance: bfa80c86-477c-4e04-96e3-d7fee174f726] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:48.765436 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None 
req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Updating instance_info_cache with network_info: [{"id": "3c99b14a-715d-489f-9459-82876d9ca624", "address": "fa:16:3e:11:c2:65", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c99b14a-71", "ovs_interfaceid": "3c99b14a-715d-489f-9459-82876d9ca624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:07:48.807976 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Releasing lock "refresh_cache-4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:07:48.808626 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 
4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Instance network_info: |[{"id": "3c99b14a-715d-489f-9459-82876d9ca624", "address": "fa:16:3e:11:c2:65", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c99b14a-71", "ovs_interfaceid": "3c99b14a-715d-489f-9459-82876d9ca624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:07:48.809176 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0b1d9fa3-813e-4998-a4b9-10e428a15200 req-f598427d-490e-4525-a844-7da5c8027849 service nova] Acquired lock "refresh_cache-4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:07:48.809602 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-0b1d9fa3-813e-4998-a4b9-10e428a15200 req-f598427d-490e-4525-a844-7da5c8027849 service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Refreshing network info cache for port 3c99b14a-715d-489f-9459-82876d9ca624 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:07:48.818097 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None 
req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Start _get_guest_xml network_info=[{"id": "3c99b14a-715d-489f-9459-82876d9ca624", "address": "fa:16:3e:11:c2:65", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c99b14a-71", "ovs_interfaceid": "3c99b14a-715d-489f-9459-82876d9ca624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': 
[{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:07:49.012837 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:07:49.021950 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:07:49.022901 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:07:49.027250 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:07:49.027864 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CPU controller found on host. {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:07:49.030161 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:07:49.033245 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:07:49.033894 np0035104604 nova-compute[107505]: DEBUG 
nova.virt.hardware [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:07:49.034383 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:07:49.034759 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:07:49.035092 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:07:49.035452 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:07:49.035841 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:07:49.036231 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:07:49.036955 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:07:49.037387 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:07:49.038027 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:07:49.058270 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:49.862467 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.798s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:49.905088 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:49.922374 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:50.465471 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-0b1d9fa3-813e-4998-a4b9-10e428a15200 req-f598427d-490e-4525-a844-7da5c8027849 service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Updated VIF entry in instance network info cache for port 3c99b14a-715d-489f-9459-82876d9ca624. 
{{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:07:50.466351 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-0b1d9fa3-813e-4998-a4b9-10e428a15200 req-f598427d-490e-4525-a844-7da5c8027849 service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Updating instance_info_cache with network_info: [{"id": "3c99b14a-715d-489f-9459-82876d9ca624", "address": "fa:16:3e:11:c2:65", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c99b14a-71", "ovs_interfaceid": "3c99b14a-715d-489f-9459-82876d9ca624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:07:50.486294 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0b1d9fa3-813e-4998-a4b9-10e428a15200 req-f598427d-490e-4525-a844-7da5c8027849 service nova] Releasing lock "refresh_cache-4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:07:50.754286 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:50.887396 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.965s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:50.889872 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:07:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1373465328',display_name='tempest-ServersAdminTestJSON-server-1373465328',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadmintestjson-server-1373465328',id=9,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b756ee34b70d4189a05bf0a0cb751e60',ramdisk_id='',reservation_id='r-5xdqendf',resources=None,root_device_name='/dev/vda',root_gb=1,security_g
roups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-499476127',owner_user_name='tempest-ServersAdminTestJSON-499476127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:07:45Z,user_data=None,user_id='5a6677b100d348b69575a22d363e18f4',uuid=4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c99b14a-715d-489f-9459-82876d9ca624", "address": "fa:16:3e:11:c2:65", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c99b14a-71", "ovs_interfaceid": "3c99b14a-715d-489f-9459-82876d9ca624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) 
get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:07:50.890495 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converting VIF {"id": "3c99b14a-715d-489f-9459-82876d9ca624", "address": "fa:16:3e:11:c2:65", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c99b14a-71", "ovs_interfaceid": "3c99b14a-715d-489f-9459-82876d9ca624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:07:50.892081 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:c2:65,bridge_name='br-int',has_traffic_filtering=True,id=3c99b14a-715d-489f-9459-82876d9ca624,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c99b14a-71') {{(pid=107505) nova_to_osvif_vif 
/opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:07:50.893233 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lazy-loading 'pci_devices' on Instance uuid 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:07:50.909223 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] End _get_guest_xml xml= [guest domain XML omitted: the angle-bracket markup was stripped during log capture, leaving only element values interleaved with repeated journald line prefixes. Recoverable values: uuid 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc, name instance-00000009, memory 131072, vcpus 1, nova metadata (instance name tempest-ServersAdminTestJSON-server-1373465328, creation time 2023-08-30 14:07:49, flavor values 128 / 1 / 0 / 0 / 1, owner tempest-ServersAdminTestJSON-499476127-project-member / tempest-ServersAdminTestJSON-499476127), sysinfo (OpenStack Foundation, OpenStack Nova, version 27.1.0, serial/uuid 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc, Virtual Machine), os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:07:50.917501 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Preparing to wait for external event network-vif-plugged-3c99b14a-715d-489f-9459-82876d9ca624 {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:07:50.917501 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:50.917501 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" acquired by
"nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:50.917501 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:50.917501 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:07:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1373465328',display_name='tempest-ServersAdminTestJSON-server-1373465328',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadmintestjson-server-1373465328',id=9,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDe
viceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b756ee34b70d4189a05bf0a0cb751e60',ramdisk_id='',reservation_id='r-5xdqendf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-499476127',owner_user_name='tempest-ServersAdminTestJSON-499476127-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:07:45Z,user_data=None,user_id='5a6677b100d348b69575a22d363e18f4',uuid=4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c99b14a-715d-489f-9459-82876d9ca624", "address": "fa:16:3e:11:c2:65", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c99b14a-71", "ovs_interfaceid": 
"3c99b14a-715d-489f-9459-82876d9ca624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:07:50.918333 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converting VIF {"id": "3c99b14a-715d-489f-9459-82876d9ca624", "address": "fa:16:3e:11:c2:65", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c99b14a-71", "ovs_interfaceid": "3c99b14a-715d-489f-9459-82876d9ca624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:07:50.918333 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:11:c2:65,bridge_name='br-int',has_traffic_filtering=True,id=3c99b14a-715d-489f-9459-82876d9ca624,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c99b14a-71') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:07:50.918333 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:c2:65,bridge_name='br-int',has_traffic_filtering=True,id=3c99b14a-715d-489f-9459-82876d9ca624,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c99b14a-71') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:07:50.918333 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:50.919671 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:07:50.919671 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:07:50.923984 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:50.924436 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c99b14a-71, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:07:50.925111 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3c99b14a-71, col_values=(('external_ids', {'iface-id': '3c99b14a-715d-489f-9459-82876d9ca624', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:11:c2:65', 'vm-uuid': '4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:07:50.926602 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:50.930384 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:07:50.932832 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:50.934869 np0035104604 nova-compute[107505]: INFO os_vif [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:11:c2:65,bridge_name='br-int',has_traffic_filtering=True,id=3c99b14a-715d-489f-9459-82876d9ca624,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c99b14a-71') Aug 30 14:07:50.989798 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:07:50.990465 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] No BDM found with device name hda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:07:50.990794 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] No VIF found with MAC fa:16:3e:11:c2:65, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:07:50.992569 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Using config drive Aug 30 14:07:51.034741 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 
4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:51.455138 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Creating config drive at /opt/stack/data/nova/instances/4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc/disk.config Aug 30 14:07:51.460098 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpd_ddorwh {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:51.494441 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpd_ddorwh" returned: 0 in 0.034s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:51.534153 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc_disk.config does not exist 
{{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:51.546870 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc/disk.config 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:51.858977 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc/disk.config 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.312s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:51.859969 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Deleting local config drive /opt/stack/data/nova/instances/4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc/disk.config because it was imported into RBD. 
Aug 30 14:07:51.898217 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:51.924335 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:51.994222 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:52.147750 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:53.105416 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-c3ba1631-9291-47f1-a40c-f51d6ba3f6bb req-54218a81-e9ea-40d2-a816-323b56ca81fe service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Received event network-vif-plugged-3c99b14a-715d-489f-9459-82876d9ca624 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:07:53.106086 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c3ba1631-9291-47f1-a40c-f51d6ba3f6bb req-54218a81-e9ea-40d2-a816-323b56ca81fe service nova] Acquiring lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:53.106468 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c3ba1631-9291-47f1-a40c-f51d6ba3f6bb req-54218a81-e9ea-40d2-a816-323b56ca81fe service nova] Lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: 
waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:53.106903 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c3ba1631-9291-47f1-a40c-f51d6ba3f6bb req-54218a81-e9ea-40d2-a816-323b56ca81fe service nova] Lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:53.107368 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-c3ba1631-9291-47f1-a40c-f51d6ba3f6bb req-54218a81-e9ea-40d2-a816-323b56ca81fe service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Processing event network-vif-plugged-3c99b14a-715d-489f-9459-82876d9ca624 {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:07:53.586445 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "f4820198-6af7-4434-b811-c208d50a5743" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:53.587312 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "f4820198-6af7-4434-b811-c208d50a5743" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:53.618855 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Starting instance... {{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:07:53.634808 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:53.635245 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] VM Started (Lifecycle Event) Aug 30 14:07:53.643572 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:07:53.650061 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:07:53.662231 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:53.669328 
np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:53.669714 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Instance spawned successfully. Aug 30 14:07:53.675437 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:07:53.703405 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:07:53.893817 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] During sync_power_state the instance has a pending task (spawning). Skip. 
Aug 30 14:07:53.894918 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:53.894918 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] VM Paused (Lifecycle Event) Aug 30 14:07:53.899352 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:53.899675 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:53.900377 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:53.901001 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Found 
default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:53.901792 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:53.906017 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:07:53.915189 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:53.915327 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:53.917305 np0035104604 nova-compute[107505]: DEBUG 
nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:53.922504 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:07:53.922771 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Claim successful on node np0035104604 Aug 30 14:07:53.929287 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:07:53.929530 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] VM Resumed (Lifecycle Event) Aug 30 14:07:53.947203 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:53.950296 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current 
task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:07:53.991115 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:07:53.994256 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Took 9.21 seconds to spawn the instance on the hypervisor. Aug 30 14:07:53.994655 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:07:54.264911 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Took 11.92 seconds to build instance. 
Aug 30 14:07:54.503095 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-39e1ef16-c2a8-4c27-aad3-98eb2fc00f0a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.258s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:55.087236 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:55.352283 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-50c5deb4-d006-49df-98fc-6fac677f5a09 req-c89b346f-223d-4994-a183-978da50cc032 service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Received event network-vif-plugged-3c99b14a-715d-489f-9459-82876d9ca624 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:07:55.353360 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-50c5deb4-d006-49df-98fc-6fac677f5a09 req-c89b346f-223d-4994-a183-978da50cc032 service nova] Acquiring lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:55.353547 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-50c5deb4-d006-49df-98fc-6fac677f5a09 req-c89b346f-223d-4994-a183-978da50cc032 service nova] Lock 
"4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:55.354988 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-50c5deb4-d006-49df-98fc-6fac677f5a09 req-c89b346f-223d-4994-a183-978da50cc032 service nova] Lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:55.355517 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-50c5deb4-d006-49df-98fc-6fac677f5a09 req-c89b346f-223d-4994-a183-978da50cc032 service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] No waiting events found dispatching network-vif-plugged-3c99b14a-715d-489f-9459-82876d9ca624 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:07:55.356007 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-50c5deb4-d006-49df-98fc-6fac677f5a09 req-c89b346f-223d-4994-a183-978da50cc032 service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Received unexpected event network-vif-plugged-3c99b14a-715d-489f-9459-82876d9ca624 for instance with vm_state active and task_state None. 
Aug 30 14:07:55.927947 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:56.107246 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 1.020s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:56.113066 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:07:56.134374 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:07:56.211555 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None 
req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.296s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:56.212571 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:07:56.316574 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Allocating IP information in the background. {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:07:56.317023 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:07:56.448507 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Aug 30 14:07:56.487992 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Start building block device mappings for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:07:56.584658 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8043c2265984da88bd2252d8b2cf983', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b35dffa3f90f40559bd146f98ed3083d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:07:56.903981 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:07:56.903981 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:07:56.903981 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Creating image(s) Aug 30 14:07:56.940204 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image f4820198-6af7-4434-b811-c208d50a5743_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:57.010430 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image f4820198-6af7-4434-b811-c208d50a5743_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:57.040154 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image f4820198-6af7-4434-b811-c208d50a5743_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:57.053318 
np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:57.143947 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.090s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:57.144806 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:57.146338 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock 
"d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:57.146799 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:57.181212 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image f4820198-6af7-4434-b811-c208d50a5743_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:07:57.191552 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e f4820198-6af7-4434-b811-c208d50a5743_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:07:57.585431 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "rbd import 
--pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e f4820198-6af7-4434-b811-c208d50a5743_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.394s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:07:57.792825 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:07:57.803514 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] resizing rbd image f4820198-6af7-4434-b811-c208d50a5743_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:07:57.982826 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:07:57.983635 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Ensure instance console log exists: /opt/stack/data/nova/instances/f4820198-6af7-4434-b811-c208d50a5743/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:07:57.984675 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 
tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:07:57.986172 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:07:57.986847 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:07:58.295115 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Successfully created port: ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:07:58.662846 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:00.335156 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron 
[None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Successfully updated port: ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:08:00.382593 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "refresh_cache-f4820198-6af7-4434-b811-c208d50a5743" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:08:00.382976 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquired lock "refresh_cache-f4820198-6af7-4434-b811-c208d50a5743" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:08:00.383330 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:08:00.620388 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-20811911-584a-4a16-9c3e-a70d042dc67f req-5bb4c928-ce59-4958-8f12-450ac191bfa7 service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Received event network-changed-ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:00.621030 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-20811911-584a-4a16-9c3e-a70d042dc67f req-5bb4c928-ce59-4958-8f12-450ac191bfa7 service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Refreshing instance network info cache due to event network-changed-ef896d3c-f7c0-4559-859b-b05102fdb388. {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:08:00.621477 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-20811911-584a-4a16-9c3e-a70d042dc67f req-5bb4c928-ce59-4958-8f12-450ac191bfa7 service nova] Acquiring lock "refresh_cache-f4820198-6af7-4434-b811-c208d50a5743" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:08:00.662810 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:00.683701 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Instance cache missing network info. 
{{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:08:01.029936 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:01.720462 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Updating instance_info_cache with network_info: [{"id": "ef896d3c-f7c0-4559-859b-b05102fdb388", "address": "fa:16:3e:95:91:0b", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapef896d3c-f7", "ovs_interfaceid": "ef896d3c-f7c0-4559-859b-b05102fdb388", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:08:01.879097 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 
tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Releasing lock "refresh_cache-f4820198-6af7-4434-b811-c208d50a5743" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:08:01.879528 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Instance network_info: |[{"id": "ef896d3c-f7c0-4559-859b-b05102fdb388", "address": "fa:16:3e:95:91:0b", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapef896d3c-f7", "ovs_interfaceid": "ef896d3c-f7c0-4559-859b-b05102fdb388", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:08:01.880408 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-20811911-584a-4a16-9c3e-a70d042dc67f req-5bb4c928-ce59-4958-8f12-450ac191bfa7 service nova] Acquired lock "refresh_cache-f4820198-6af7-4434-b811-c208d50a5743" {{(pid=107505) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:08:01.880615 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-20811911-584a-4a16-9c3e-a70d042dc67f req-5bb4c928-ce59-4958-8f12-450ac191bfa7 service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Refreshing network info cache for port ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:08:01.884528 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Start _get_guest_xml network_info=[{"id": "ef896d3c-f7c0-4559-859b-b05102fdb388", "address": "fa:16:3e:95:91:0b", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapef896d3c-f7", "ovs_interfaceid": "ef896d3c-f7c0-4559-859b-b05102fdb388", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 
'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:08:01.895900 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:08:02.041128 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... 
{{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:08:02.041645 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:08:02.047073 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:08:02.047814 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:08:02.049391 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:08:02.050202 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:07:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1045729611',id=35,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-1293507177',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:08:02.050803 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:08:02.051196 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:08:02.051641 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:08:02.051998 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:08:02.052253 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:08:02.052700 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 
14:08:02.053238 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:08:02.053843 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:08:02.053843 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:08:02.054239 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:08:02.070286 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 
30 14:08:02.618309 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Rebuilding instance Aug 30 14:08:03.090753 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 1.021s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:03.121525 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image f4820198-6af7-4434-b811-c208d50a5743_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:03.125208 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:03.642546 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 
14:08:03.664536 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:03.976751 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:03.997419 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:04.010641 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Instance destroyed successfully. Aug 30 14:08:04.021424 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Instance destroyed successfully. Aug 30 14:08:04.022804 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1551096224',display_name='tempest-ServersAdminTestJSON-server-1551096224',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadmintestjson-server-1551096224',id=7,image_ref='bd27109d-8758-41af-af44-0e764a2addca',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:07:34Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b756ee34b70d4189a05bf0a0cb751e60',ramdisk_id='',reservation_id='r-mntu6p6e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bd27109d-8758-41af-af44-0e764a2addca',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.1-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersAdminTestJSON-499476127',owner_user_name='tempest-ServersAdminTestJSON-499476127-project-member'},tags=,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:08:0
2Z,user_data=None,user_id='5a6677b100d348b69575a22d363e18f4',uuid=3e4eb957-d13b-4d8c-be3b-bad1d61114fe,vcpu_model=,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:08:04.023134 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converting VIF {"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": 
false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:08:04.024073 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:08:04.024414 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:08:04.028942 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:04.029415 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap614532f0-00, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:04.032254 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:04.036053 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:08:04.041074 np0035104604 nova-compute[107505]: INFO os_vif [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') Aug 30 14:08:04.076998 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.952s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:04.078302 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 
tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:07:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-348819062',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-348819062',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(35),hidden=False,host='np0035104604',hostname='tempest-serverswithspecificflavortestjson-server-348819062',id=10,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=35,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHHdXbSjxXwqDwe7VjknCsPB7FlaX9kbdidTvee1eM5xK4auSl9TOs3ulw+MzaECXIpkK+ZLwKiI+UH3J46BXA6E7xNkh4IyXHMT0+x8g5UZ8EaR+YM4WD80W1OwwUhUEg==',key_name='tempest-keypair-609321772',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b35dffa3f90f40559bd146f98ed3083d',ramdisk_id='',reservation_id='r-uu3dhq4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_i
mporting_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-5414259',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:07:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d8043c2265984da88bd2252d8b2cf983',uuid=f4820198-6af7-4434-b811-c208d50a5743,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef896d3c-f7c0-4559-859b-b05102fdb388", "address": "fa:16:3e:95:91:0b", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapef896d3c-f7", "ovs_interfaceid": "ef896d3c-f7c0-4559-859b-b05102fdb388", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:08:04.078752 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 
tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Converting VIF {"id": "ef896d3c-f7c0-4559-859b-b05102fdb388", "address": "fa:16:3e:95:91:0b", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapef896d3c-f7", "ovs_interfaceid": "ef896d3c-f7c0-4559-859b-b05102fdb388", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:08:04.079762 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:91:0b,bridge_name='br-int',has_traffic_filtering=True,id=ef896d3c-f7c0-4559-859b-b05102fdb388,network=Network(33008139-2fd9-496e-ac17-91db64b6f4c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef896d3c-f7') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:08:04.080989 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None 
req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lazy-loading 'pci_devices' on Instance uuid f4820198-6af7-4434-b811-c208d50a5743 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:08:04.097117 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] End _get_guest_xml xml= [guest domain XML elided: markup was stripped during log capture, leaving only text values across the continuation lines; recoverable values: uuid f4820198-6af7-4434-b811-c208d50a5743, name instance-0000000a, memory 131072, vcpus 1, display name tempest-ServersWithSpecificFlavorTestJSON-server-348819062, creation time 2023-08-30 14:08:01, flavor 128 MB / 1 vCPU, owner tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member / tempest-ServersWithSpecificFlavorTestJSON-5414259, sysinfo OpenStack Foundation / OpenStack Nova 27.1.0, os type hvm, CPU model Nehalem, rng backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:08:04.105299 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Preparing to wait for external event network-vif-plugged-ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:08:04.105299 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "f4820198-6af7-4434-b811-c208d50a5743-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:04.105299 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock
"f4820198-6af7-4434-b811-c208d50a5743-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:04.105299 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "f4820198-6af7-4434-b811-c208d50a5743-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:04.105299 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:07:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-348819062',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-348819062',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(35),hidden=False,host='np0035104604',hostname='tempest-serverswithspecificflavortestjson-server-348819062',id=10,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=35,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHHdXbSjxXwqDwe7VjknCsPB7FlaX9kbdidTvee1eM5xK4auSl9TOs3ulw+MzaECXIpkK+ZLwKiI+UH3J46BXA6E7xNkh4IyXHMT0+x8g5UZ8EaR+YM4WD80W1OwwUhUEg==',key_name='tempest-keypair-609321772',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b35dffa3f90f40559bd146f98ed3083d',ramdisk_id='',reservation_id='r-uu3dhq4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-5414259',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:07:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d8043c2265984da88bd2252d8b2cf983',uuid=f4820198-6af7-4434-b811-c208d50a5743,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef896d3c-f7c0-4559-859b-b05102fdb388", "address": "fa:16:3e:95:91:0b", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": 
"tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapef896d3c-f7", "ovs_interfaceid": "ef896d3c-f7c0-4559-859b-b05102fdb388", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:08:04.106477 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Converting VIF {"id": "ef896d3c-f7c0-4559-859b-b05102fdb388", "address": "fa:16:3e:95:91:0b", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapef896d3c-f7", "ovs_interfaceid": 
"ef896d3c-f7c0-4559-859b-b05102fdb388", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:08:04.106477 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:91:0b,bridge_name='br-int',has_traffic_filtering=True,id=ef896d3c-f7c0-4559-859b-b05102fdb388,network=Network(33008139-2fd9-496e-ac17-91db64b6f4c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef896d3c-f7') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:08:04.106477 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:91:0b,bridge_name='br-int',has_traffic_filtering=True,id=ef896d3c-f7c0-4559-859b-b05102fdb388,network=Network(33008139-2fd9-496e-ac17-91db64b6f4c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef896d3c-f7') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:08:04.106477 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:04.106477 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, 
datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:04.106477 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:08:04.109932 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:04.110389 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef896d3c-f7, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:04.111119 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapef896d3c-f7, col_values=(('external_ids', {'iface-id': 'ef896d3c-f7c0-4559-859b-b05102fdb388', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:91:0b', 'vm-uuid': 'f4820198-6af7-4434-b811-c208d50a5743'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:04.112865 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:04.119290 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-20811911-584a-4a16-9c3e-a70d042dc67f req-5bb4c928-ce59-4958-8f12-450ac191bfa7 service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Updated VIF entry in instance network info 
cache for port ef896d3c-f7c0-4559-859b-b05102fdb388. {{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:08:04.120136 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-20811911-584a-4a16-9c3e-a70d042dc67f req-5bb4c928-ce59-4958-8f12-450ac191bfa7 service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Updating instance_info_cache with network_info: [{"id": "ef896d3c-f7c0-4559-859b-b05102fdb388", "address": "fa:16:3e:95:91:0b", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapef896d3c-f7", "ovs_interfaceid": "ef896d3c-f7c0-4559-859b-b05102fdb388", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:08:04.123638 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:08:04.127120 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:04.129818 np0035104604 
nova-compute[107505]: INFO os_vif [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:91:0b,bridge_name='br-int',has_traffic_filtering=True,id=ef896d3c-f7c0-4559-859b-b05102fdb388,network=Network(33008139-2fd9-496e-ac17-91db64b6f4c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef896d3c-f7') Aug 30 14:08:04.138795 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-20811911-584a-4a16-9c3e-a70d042dc67f req-5bb4c928-ce59-4958-8f12-450ac191bfa7 service nova] Releasing lock "refresh_cache-f4820198-6af7-4434-b811-c208d50a5743" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:08:04.192416 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:08:04.192877 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:08:04.193326 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] No VIF found with MAC fa:16:3e:95:91:0b, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:08:04.194491 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Using config drive Aug 30 14:08:04.233487 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image f4820198-6af7-4434-b811-c208d50a5743_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:04.621648 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Deleting instance files /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe_del Aug 30 14:08:04.622371 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Deletion of /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe_del complete Aug 30 14:08:04.663129 np0035104604 
nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Creating config drive at /opt/stack/data/nova/instances/f4820198-6af7-4434-b811-c208d50a5743/disk.config Aug 30 14:08:04.668803 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/f4820198-6af7-4434-b811-c208d50a5743/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpuyf24_rh {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:04.708445 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/f4820198-6af7-4434-b811-c208d50a5743/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpuyf24_rh" returned: 0 in 0.040s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:04.743610 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image f4820198-6af7-4434-b811-c208d50a5743_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 
14:08:04.748609 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/f4820198-6af7-4434-b811-c208d50a5743/disk.config f4820198-6af7-4434-b811-c208d50a5743_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:04.837987 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:08:04.838763 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Creating image(s) Aug 30 14:08:04.876510 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:04.906228 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk does not exist {{(pid=107505) __init__ 
/opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:04.936387 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:04.941179 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d279ce3c0a0edcb9bee120b54ec031d664e9e8d4 --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:04.959038 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/f4820198-6af7-4434-b811-c208d50a5743/disk.config f4820198-6af7-4434-b811-c208d50a5743_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.210s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:04.959652 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Deleting local config drive 
/opt/stack/data/nova/instances/f4820198-6af7-4434-b811-c208d50a5743/disk.config because it was imported into RBD. Aug 30 14:08:04.977918 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:05.035601 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d279ce3c0a0edcb9bee120b54ec031d664e9e8d4 --force-share --output=json" returned: 0 in 0.093s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:05.035601 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "d279ce3c0a0edcb9bee120b54ec031d664e9e8d4" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:05.035601 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "d279ce3c0a0edcb9bee120b54ec031d664e9e8d4" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:05.039090 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None
req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "d279ce3c0a0edcb9bee120b54ec031d664e9e8d4" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:05.083035 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:05.083035 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d279ce3c0a0edcb9bee120b54ec031d664e9e8d4 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:05.362507 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-f9e47a89-74ee-4a28-bfc2-d761e79b1e21 req-a71556b5-b878-45ad-9b90-f1328b0ca221 service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Received event network-vif-plugged-ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:05.363307 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-f9e47a89-74ee-4a28-bfc2-d761e79b1e21 req-a71556b5-b878-45ad-9b90-f1328b0ca221 service nova] Acquiring lock "f4820198-6af7-4434-b811-c208d50a5743-events" by
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:05.363667 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-f9e47a89-74ee-4a28-bfc2-d761e79b1e21 req-a71556b5-b878-45ad-9b90-f1328b0ca221 service nova] Lock "f4820198-6af7-4434-b811-c208d50a5743-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:05.365679 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-f9e47a89-74ee-4a28-bfc2-d761e79b1e21 req-a71556b5-b878-45ad-9b90-f1328b0ca221 service nova] Lock "f4820198-6af7-4434-b811-c208d50a5743-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:05.365679 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-f9e47a89-74ee-4a28-bfc2-d761e79b1e21 req-a71556b5-b878-45ad-9b90-f1328b0ca221 service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Processing event network-vif-plugged-ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:08:05.468566 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:05.485066 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:05.503556 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18
{{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:05.554265 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d279ce3c0a0edcb9bee120b54ec031d664e9e8d4 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:05.686877 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] resizing rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:08:05.790429 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:08:05.790623 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Ensure instance console log exists: /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:08:05.790997 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:05.791538 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:05.791820 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:05.794597 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Start _get_guest_xml network_info=[{"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='0c839612eb3f2469420f2ccae990827f',container_format='bare',created_at=2023-08-30T14:01:42Z,direct_url=,disk_format='qcow2',id=bd27109d-8758-41af-af44-0e764a2addca,min_disk=0,min_ram=0,name='cirros-0.6.1-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21233664,status='active',tags=,updated_at=2023-08-30T14:01:49Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:08:05.808604 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 
tempest-ServersAdminTestJSON-499476127-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError Aug 30 14:08:05.813550 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:08:05.814430 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:08:05.817410 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:08:05.817884 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:08:05.819603 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:08:05.820182 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='0c839612eb3f2469420f2ccae990827f',container_format='bare',created_at=2023-08-30T14:01:42Z,direct_url=,disk_format='qcow2',id=bd27109d-8758-41af-af44-0e764a2addca,min_disk=0,min_ram=0,name='cirros-0.6.1-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21233664,status='active',tags=,updated_at=2023-08-30T14:01:49Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:08:05.820582 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:08:05.820971 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware 
[None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:08:05.821144 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:08:05.821483 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:08:05.821977 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:08:05.822300 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:08:05.822634 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Build topologies for 1 
vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:08:05.822945 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:08:05.823209 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:08:05.823480 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:08:05.823847 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lazy-loading 'vcpu_model' on Instance uuid 3e4eb957-d13b-4d8c-be3b-bad1d61114fe {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:08:05.864670 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:06.073151 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:08:06.073612 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: f4820198-6af7-4434-b811-c208d50a5743] VM Started (Lifecycle Event) Aug 30 14:08:06.081950 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-304bb6ec-5c3a-4c61-a217-c1a27857b1ef req-5ce5c656-b50d-4139-b56f-6706e01655c8 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received event network-vif-unplugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:06.082455 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-304bb6ec-5c3a-4c61-a217-c1a27857b1ef req-5ce5c656-b50d-4139-b56f-6706e01655c8 service nova] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:06.083010 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-304bb6ec-5c3a-4c61-a217-c1a27857b1ef req-5ce5c656-b50d-4139-b56f-6706e01655c8 service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:06.085047 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-304bb6ec-5c3a-4c61-a217-c1a27857b1ef req-5ce5c656-b50d-4139-b56f-6706e01655c8 service nova] Lock
"3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:06.085569 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-304bb6ec-5c3a-4c61-a217-c1a27857b1ef req-5ce5c656-b50d-4139-b56f-6706e01655c8 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] No waiting events found dispatching network-vif-unplugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:08:06.086017 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-304bb6ec-5c3a-4c61-a217-c1a27857b1ef req-5ce5c656-b50d-4139-b56f-6706e01655c8 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received unexpected event network-vif-unplugged-614532f0-0086-4382-b9bd-08c309fc6548 for instance with vm_state error and task_state rebuild_spawning. 
Aug 30 14:08:06.086980 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:08:06.103134 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:08:06.104276 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: f4820198-6af7-4434-b811-c208d50a5743] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:06.115767 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: f4820198-6af7-4434-b811-c208d50a5743] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:08:06.120264 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: f4820198-6af7-4434-b811-c208d50a5743] Instance spawned successfully. 
Aug 30 14:08:06.121779 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:08:06.141572 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: f4820198-6af7-4434-b811-c208d50a5743] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:08:06.142128 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:08:06.142558 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: f4820198-6af7-4434-b811-c208d50a5743] VM Paused (Lifecycle Event) Aug 30 14:08:06.167075 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: f4820198-6af7-4434-b811-c208d50a5743] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:06.170581 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details 
/opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:06.171059 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:06.171997 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:06.172741 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:06.173830 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:06.176083 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 
tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:06.195750 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:08:06.196176 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: f4820198-6af7-4434-b811-c208d50a5743] VM Resumed (Lifecycle Event) Aug 30 14:08:06.220774 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: f4820198-6af7-4434-b811-c208d50a5743] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:06.232847 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: f4820198-6af7-4434-b811-c208d50a5743] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:08:06.286009 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: f4820198-6af7-4434-b811-c208d50a5743] During sync_power_state the instance has a pending task (spawning). Skip. 
Aug 30 14:08:06.309048 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Took 9.41 seconds to spawn the instance on the hypervisor. Aug 30 14:08:06.309803 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:06.433156 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Took 12.71 seconds to build instance. 
Aug 30 14:08:06.452718 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-e005eba6-ccb3-4d8d-af10-dbc48bc764d2 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "f4820198-6af7-4434-b811-c208d50a5743" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.865s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:06.750596 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.886s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:06.780471 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:06.788413 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:07.026669 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource 
{{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:08:07.072696 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:07.073425 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:07.074037 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:07.074624 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:08:07.075542 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:07.579612 np0035104604 nova-compute[107505]: DEBUG 
nova.compute.manager [req-5bdc5630-7382-494a-ac14-da345e62234d req-1b346211-c339-4b85-a436-d434d0e4580a service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Received event network-vif-plugged-ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:07.581569 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-5bdc5630-7382-494a-ac14-da345e62234d req-1b346211-c339-4b85-a436-d434d0e4580a service nova] Acquiring lock "f4820198-6af7-4434-b811-c208d50a5743-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:07.582475 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-5bdc5630-7382-494a-ac14-da345e62234d req-1b346211-c339-4b85-a436-d434d0e4580a service nova] Lock "f4820198-6af7-4434-b811-c208d50a5743-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:07.583283 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-5bdc5630-7382-494a-ac14-da345e62234d req-1b346211-c339-4b85-a436-d434d0e4580a service nova] Lock "f4820198-6af7-4434-b811-c208d50a5743-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:07.584417 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-5bdc5630-7382-494a-ac14-da345e62234d req-1b346211-c339-4b85-a436-d434d0e4580a service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] No waiting events found dispatching network-vif-plugged-ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:08:07.585531 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-5bdc5630-7382-494a-ac14-da345e62234d req-1b346211-c339-4b85-a436-d434d0e4580a service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Received unexpected event network-vif-plugged-ef896d3c-f7c0-4559-859b-b05102fdb388 for instance with vm_state active and task_state None. Aug 30 14:08:07.695836 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.907s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:07.699548 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2023-08-30T14:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1551096224',display_name='tempest-ServersAdminTestJSON-server-1551096224',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadmintestjson-server-1551096224',id=7,image_ref='bd27109d-8758-41af-af44-0e764a2addca',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:07:34Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b756ee34b70d4189a05bf0a0cb751e60',ramdisk_id='',reservation_id='r-mntu6p6e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='bd27109d-8758-41af-af44-0e764a2addca',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.1-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersAdminTestJSON-499476127',owner_user_name='tempest-ServersAdminTestJSON-499476127-project-member'},tags=,task_state='rebuild_spawning',terminated_at=No
ne,trusted_certs=None,updated_at=2023-08-30T14:08:05Z,user_data=None,user_id='5a6677b100d348b69575a22d363e18f4',uuid=3e4eb957-d13b-4d8c-be3b-bad1d61114fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:08:07.700779 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converting VIF {"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:08:07.702648 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] End _get_guest_xml xml= Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: instance-00000007 Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: 131072 Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: 1 Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 
14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: tempest-ServersAdminTestJSON-server-1551096224 Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: 2023-08-30 14:08:05 Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: 128 Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: 1 Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: 0 Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: 0 Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: 1 Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: tempest-ServersAdminTestJSON-499476127-project-member Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: tempest-ServersAdminTestJSON-499476127 Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: OpenStack Foundation Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: OpenStack Nova Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: 27.1.0 Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe Aug 30 14:08:07.708222 np0035104604 
nova-compute[107505]: Virtual Machine Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: hvm Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Nehalem Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 
nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: /dev/urandom Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: Aug 30 14:08:07.708222 np0035104604 nova-compute[107505]: {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:08:07.717315 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Preparing to wait for external event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) prepare_for_instance_event 
/opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:08:07.718070 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:07.718601 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:07.719098 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:07.721225 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2023-08-30T14:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1551096224',display_name='tempest-ServersAdminTestJSON-server-1551096224',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadmintestjson-server-1551096224',id=7,image_ref='bd27109d-8758-41af-af44-0e764a2addca',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:07:34Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b756ee34b70d4189a05bf0a0cb751e60',ramdisk_id='',reservation_id='r-mntu6p6e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='bd27109d-8758-41af-af44-0e764a2addca',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.1-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersAdminTestJSON-499476127',owner_user_name='tempest-ServersAdminTestJSON-499476127-project-member'},tags=,task_state='rebuild_spawning',terminated_at=No
ne,trusted_certs=None,updated_at=2023-08-30T14:08:05Z,user_data=None,user_id='5a6677b100d348b69575a22d363e18f4',uuid=3e4eb957-d13b-4d8c-be3b-bad1d61114fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:08:07.722707 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converting VIF {"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:08:07.724309 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:08:07.725325 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:08:07.727047 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) 
__log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:07.727872 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:07.729170 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:08:07.739360 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:07.739869 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap614532f0-00, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:07.741555 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap614532f0-00, col_values=(('external_ids', {'iface-id': '614532f0-0086-4382-b9bd-08c309fc6548', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:63:35', 'vm-uuid': '3e4eb957-d13b-4d8c-be3b-bad1d61114fe'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:07.747612 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:07.752978 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:08:07.755549 np0035104604 nova-compute[107505]: INFO os_vif [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') Aug 30 14:08:07.812480 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:08:07.813853 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:08:07.814323 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] No VIF found with MAC fa:16:3e:e5:63:35, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:08:07.816402 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Using config drive Aug 30 14:08:07.869818 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:07.901926 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lazy-loading 'ec2_ids' on Instance uuid 3e4eb957-d13b-4d8c-be3b-bad1d61114fe {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:08:07.918376 np0035104604 nova-compute[107505]: DEBUG nova.objects.base [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Object Instance<3e4eb957-d13b-4d8c-be3b-bad1d61114fe> lazy-loaded attributes: vcpu_model,ec2_ids {{(pid=107505) wrapper /opt/stack/nova/nova/objects/base.py:126}} Aug 30 14:08:07.920555 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance 
[None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lazy-loading 'keypairs' on Instance uuid 3e4eb957-d13b-4d8c-be3b-bad1d61114fe {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:08:07.942726 np0035104604 nova-compute[107505]: DEBUG nova.objects.base [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Object Instance<3e4eb957-d13b-4d8c-be3b-bad1d61114fe> lazy-loaded attributes: vcpu_model,ec2_ids,keypairs {{(pid=107505) wrapper /opt/stack/nova/nova/objects/base.py:126}} Aug 30 14:08:08.150276 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 1.075s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:08.259810 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-0000000a as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:08:08.260746 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-0000000a as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:08:08.265665 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000009 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 
14:08:08.266160 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000009 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:08:08.270457 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000007 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:08:08.270813 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000007 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:08:08.301494 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000008 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:08:08.301891 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000008 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:08:08.602374 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-3648699b-94c5-4dd1-bf35-41715fa52573 req-c217c238-8b03-498d-b96f-496636a80a9c service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:08.602858 np0035104604 
nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-3648699b-94c5-4dd1-bf35-41715fa52573 req-c217c238-8b03-498d-b96f-496636a80a9c service nova] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:08.603864 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-3648699b-94c5-4dd1-bf35-41715fa52573 req-c217c238-8b03-498d-b96f-496636a80a9c service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:08.603864 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-3648699b-94c5-4dd1-bf35-41715fa52573 req-c217c238-8b03-498d-b96f-496636a80a9c service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:08.605101 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-3648699b-94c5-4dd1-bf35-41715fa52573 req-c217c238-8b03-498d-b96f-496636a80a9c service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Processing event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:08:08.607757 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Aug 30 14:08:08.609825 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=961MB free_disk=29.873008728027344GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:08:08.610264 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:08.610782 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:08.676771 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Creating config drive at /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config Aug 30 14:08:08.682530 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpedyv5vfi {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:08.705321 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:08.839848 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpedyv5vfi" returned: 0 in 0.157s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:08.871793 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:08.875386 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:08.931511 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 3e4eb957-d13b-4d8c-be3b-bad1d61114fe actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:08:08.931891 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 8f0941f9-96cc-445a-9d3b-83aa415ff950 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:08:08.932669 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:08:08.933375 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance f4820198-6af7-4434-b811-c208d50a5743 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:08:08.933809 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 4 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:08:08.934517 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=1024MB phys_disk=29GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:08:09.108939 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.234s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:09.112283 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Deleting local config drive /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config because it was imported into RBD. 
Aug 30 14:08:09.292283 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:09.472264 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:09.860358 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:10.353672 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Removed pending event for 3e4eb957-d13b-4d8c-be3b-bad1d61114fe due to event {{(pid=107505) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:438}} Aug 30 14:08:10.354401 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:08:10.354878 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] VM Started (Lifecycle Event) Aug 30 14:08:10.359831 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:08:10.372075 np0035104604 
nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:08:10.376776 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-36407e1b-8670-4e85-836d-e74f71b84572 req-507a5e3e-1beb-4e01-a83a-455eddb325b2 service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Received event network-changed-ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:10.376776 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-36407e1b-8670-4e85-836d-e74f71b84572 req-507a5e3e-1beb-4e01-a83a-455eddb325b2 service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Refreshing instance network info cache due to event network-changed-ef896d3c-f7c0-4559-859b-b05102fdb388. 
{{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:08:10.376776 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-36407e1b-8670-4e85-836d-e74f71b84572 req-507a5e3e-1beb-4e01-a83a-455eddb325b2 service nova] Acquiring lock "refresh_cache-f4820198-6af7-4434-b811-c208d50a5743" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:08:10.376776 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-36407e1b-8670-4e85-836d-e74f71b84572 req-507a5e3e-1beb-4e01-a83a-455eddb325b2 service nova] Acquired lock "refresh_cache-f4820198-6af7-4434-b811-c208d50a5743" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:08:10.376776 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-36407e1b-8670-4e85-836d-e74f71b84572 req-507a5e3e-1beb-4e01-a83a-455eddb325b2 service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Refreshing network info cache for port ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:08:10.385900 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:10.392636 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Instance spawned successfully. 
Aug 30 14:08:10.393409 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:08:10.554295 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:08:10.576077 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:10.576470 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:10.577282 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e 
tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:10.577965 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:10.584209 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:10.584209 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:10.636558 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] During sync_power_state the instance has a pending task (rebuild_spawning). Skip. 
Aug 30 14:08:10.637317 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:08:10.637658 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] VM Paused (Lifecycle Event) Aug 30 14:08:10.659512 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:10.666314 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:08:10.666674 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] VM Resumed (Lifecycle Event) Aug 30 14:08:10.690685 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-af3b936d-f29b-4bb0-997d-c22d8c8314cc req-655c5969-7723-4fc1-9747-9983bcd1e274 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:10.690685 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-af3b936d-f29b-4bb0-997d-c22d8c8314cc req-655c5969-7723-4fc1-9747-9983bcd1e274 service nova] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:10.691139 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-af3b936d-f29b-4bb0-997d-c22d8c8314cc req-655c5969-7723-4fc1-9747-9983bcd1e274 service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:10.691544 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-af3b936d-f29b-4bb0-997d-c22d8c8314cc req-655c5969-7723-4fc1-9747-9983bcd1e274 service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:10.691940 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-af3b936d-f29b-4bb0-997d-c22d8c8314cc req-655c5969-7723-4fc1-9747-9983bcd1e274 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] No waiting events found dispatching network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:08:10.692328 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-af3b936d-f29b-4bb0-997d-c22d8c8314cc req-655c5969-7723-4fc1-9747-9983bcd1e274 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received unexpected event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 for instance with vm_state error and task_state rebuild_spawning. 
Aug 30 14:08:10.692774 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-af3b936d-f29b-4bb0-997d-c22d8c8314cc req-655c5969-7723-4fc1-9747-9983bcd1e274 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:10.693088 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-af3b936d-f29b-4bb0-997d-c22d8c8314cc req-655c5969-7723-4fc1-9747-9983bcd1e274 service nova] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:10.693528 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-af3b936d-f29b-4bb0-997d-c22d8c8314cc req-655c5969-7723-4fc1-9747-9983bcd1e274 service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:10.694122 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-af3b936d-f29b-4bb0-997d-c22d8c8314cc req-655c5969-7723-4fc1-9747-9983bcd1e274 service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:10.694524 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-af3b936d-f29b-4bb0-997d-c22d8c8314cc req-655c5969-7723-4fc1-9747-9983bcd1e274 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] No waiting events found dispatching 
network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:08:10.694924 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-af3b936d-f29b-4bb0-997d-c22d8c8314cc req-655c5969-7723-4fc1-9747-9983bcd1e274 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received unexpected event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 for instance with vm_state error and task_state rebuild_spawning. Aug 30 14:08:10.852225 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.992s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:10.852531 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:10.854607 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:10.859541 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 
14:08:10.874402 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:08:10.912052 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] During sync_power_state the instance has a pending task (rebuild_spawning). Skip. Aug 30 14:08:10.927696 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:08:11.089188 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:08:11.089707 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.479s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 
14:08:11.090663 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:11.091120 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:11.091521 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Trying to apply a migration context that does not seem to be set for this instance {{(pid=107505) apply_migration_context /opt/stack/nova/nova/objects/instance.py:1078}} Aug 30 14:08:11.289754 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5c464ac6-1827-45d2-b7d8-f0b358185e2e tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.198s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:12.132474 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-36407e1b-8670-4e85-836d-e74f71b84572 req-507a5e3e-1beb-4e01-a83a-455eddb325b2 service nova] [instance: 
f4820198-6af7-4434-b811-c208d50a5743] Updated VIF entry in instance network info cache for port ef896d3c-f7c0-4559-859b-b05102fdb388. {{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:08:12.133180 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-36407e1b-8670-4e85-836d-e74f71b84572 req-507a5e3e-1beb-4e01-a83a-455eddb325b2 service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Updating instance_info_cache with network_info: [{"id": "ef896d3c-f7c0-4559-859b-b05102fdb388", "address": "fa:16:3e:95:91:0b", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.10", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapef896d3c-f7", "ovs_interfaceid": "ef896d3c-f7c0-4559-859b-b05102fdb388", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:08:12.160229 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-36407e1b-8670-4e85-836d-e74f71b84572 req-507a5e3e-1beb-4e01-a83a-455eddb325b2 service nova] Releasing lock "refresh_cache-f4820198-6af7-4434-b811-c208d50a5743" {{(pid=107505) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:08:12.744380 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:13.099185 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:08:13.099787 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:08:13.100248 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:08:13.188559 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Didn't find any instances for network info cache update. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9928}} Aug 30 14:08:13.189774 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:08:13.190621 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:08:13.191064 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:08:13.191465 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:08:13.191888 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:08:13.192933 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:08:13.193238 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:08:13.573606 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Rebuilding instance Aug 30 14:08:13.706289 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:13.854063 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:14.013430 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:14.140543 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:14.164844 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:14.333047 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 
3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Instance destroyed successfully. Aug 30 14:08:14.347976 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-07d77ec4-0f77-45f2-a5d4-a1b0fc6192af req-866ac3a9-51c3-4a82-8975-931cf9fb016a service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received event network-vif-unplugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:14.348720 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-07d77ec4-0f77-45f2-a5d4-a1b0fc6192af req-866ac3a9-51c3-4a82-8975-931cf9fb016a service nova] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:14.349469 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-07d77ec4-0f77-45f2-a5d4-a1b0fc6192af req-866ac3a9-51c3-4a82-8975-931cf9fb016a service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:14.350152 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-07d77ec4-0f77-45f2-a5d4-a1b0fc6192af req-866ac3a9-51c3-4a82-8975-931cf9fb016a service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:14.350460 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-07d77ec4-0f77-45f2-a5d4-a1b0fc6192af req-866ac3a9-51c3-4a82-8975-931cf9fb016a service nova] [instance: 
3e4eb957-d13b-4d8c-be3b-bad1d61114fe] No waiting events found dispatching network-vif-unplugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:08:14.350817 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-07d77ec4-0f77-45f2-a5d4-a1b0fc6192af req-866ac3a9-51c3-4a82-8975-931cf9fb016a service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received unexpected event network-vif-unplugged-614532f0-0086-4382-b9bd-08c309fc6548 for instance with vm_state active and task_state rebuilding. Aug 30 14:08:14.352995 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:14.359776 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:14.365880 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Instance destroyed successfully. 
Aug 30 14:08:14.367429 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2023-08-30T14:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1551096224',display_name='tempest-ServersAdminTestJSON-server-1551096224',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadmintestjson-server-1551096224',id=7,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:08:10Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b756ee34b70d4189a05bf0a0cb751e60',ramdisk_id='',reservation_id='r-mntu6p6e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_sp
ecified.openstack.sha256='',owner_project_name='tempest-ServersAdminTestJSON-499476127',owner_user_name='tempest-ServersAdminTestJSON-499476127-project-member'},tags=,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:08:13Z,user_data=None,user_id='5a6677b100d348b69575a22d363e18f4',uuid=3e4eb957-d13b-4d8c-be3b-bad1d61114fe,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:08:14.367967 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converting VIF {"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], 
"gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:08:14.369059 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:08:14.369631 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') {{(pid=107505) unplug 
/opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:08:14.371831 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:14.372139 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap614532f0-00, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:14.377854 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:08:14.380237 np0035104604 nova-compute[107505]: INFO os_vif [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') Aug 30 14:08:14.766326 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Deleting instance files /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe_del Aug 30 14:08:14.766976 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 
3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Deletion of /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe_del complete Aug 30 14:08:14.962174 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:08:14.963328 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Creating image(s) Aug 30 14:08:15.014275 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:15.046536 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:15.075275 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:15.078743 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.processutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:15.178059 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.098s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:15.179437 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:15.181093 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.002s {{(pid=107505) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:15.182410 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.002s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:15.220041 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:15.224460 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:15.575127 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.351s {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:15.656887 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] resizing rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:08:15.751487 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:08:15.752209 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Ensure instance console log exists: /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:08:15.752966 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:15.753466 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] 
Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:15.754303 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:15.757589 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Start _get_guest_xml network_info=[{"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 
'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:08:15.764580 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError Aug 30 14:08:15.769115 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... 
{{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:08:15.770300 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:08:15.780932 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:08:15.781451 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:08:15.784523 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:08:15.787138 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:08:15.787838 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:08:15.788509 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware 
[None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:08:15.790007 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:08:15.790007 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:08:15.790007 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:08:15.790756 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:08:15.791232 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Build topologies for 1 
vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:08:15.791832 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:08:15.792492 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:08:15.793363 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:08:15.794086 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lazy-loading 'vcpu_model' on Instance uuid 3e4eb957-d13b-4d8c-be3b-bad1d61114fe {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:08:15.825022 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:16.423227 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-9d93743a-3722-4578-a424-b18980e934cb req-30677c49-a0b9-4eb0-8d21-c3c86ba77f90 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:16.423882 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-9d93743a-3722-4578-a424-b18980e934cb req-30677c49-a0b9-4eb0-8d21-c3c86ba77f90 service nova] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:16.425962 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-9d93743a-3722-4578-a424-b18980e934cb req-30677c49-a0b9-4eb0-8d21-c3c86ba77f90 service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:16.426504 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-9d93743a-3722-4578-a424-b18980e934cb req-30677c49-a0b9-4eb0-8d21-c3c86ba77f90 service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:16.427024 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-9d93743a-3722-4578-a424-b18980e934cb req-30677c49-a0b9-4eb0-8d21-c3c86ba77f90 service nova] [instance: 
3e4eb957-d13b-4d8c-be3b-bad1d61114fe] No waiting events found dispatching network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:08:16.427554 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-9d93743a-3722-4578-a424-b18980e934cb req-30677c49-a0b9-4eb0-8d21-c3c86ba77f90 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received unexpected event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 for instance with vm_state active and task_state rebuild_spawning. Aug 30 14:08:16.658385 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.834s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:16.692216 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:16.697559 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:17.412170 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 
tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.714s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:17.414568 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2023-08-30T14:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1551096224',display_name='tempest-ServersAdminTestJSON-server-1551096224',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadmintestjson-server-1551096224',id=7,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:08:10Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b756ee34b70d4189a05bf0a0cb751e60',ramdisk_id='',reservation_id='r-mntu6p6e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_fo
rmat='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersAdminTestJSON-499476127',owner_user_name='tempest-ServersAdminTestJSON-499476127-project-member'},tags=,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:08:15Z,user_data=None,user_id='5a6677b100d348b69575a22d363e18f4',uuid=3e4eb957-d13b-4d8c-be3b-bad1d61114fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:08:17.415063 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 
tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converting VIF {"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:08:17.416550 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:08:17.420834 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 
tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] End _get_guest_xml xml= [libvirt domain XML elided: element markup was stripped during log extraction; recoverable values: name instance-00000007, uuid 3e4eb957-d13b-4d8c-be3b-bad1d61114fe, memory 131072, 1 vCPU, nova metadata (server tempest-ServersAdminTestJSON-server-1551096224, creationTime 2023-08-30 14:08:15, flavor 128 MB / 1 vCPU / 0 swap / 0 ephemeral / 1 GB disk, owner tempest-ServersAdminTestJSON-499476127-project-member in project tempest-ServersAdminTestJSON-499476127), sysinfo manufacturer OpenStack Foundation, product OpenStack Nova, version 27.1.0, family Virtual Machine, os type hvm, custom CPU model Nehalem, RNG backend /dev/urandom] Aug 30
14:08:17.420834 np0035104604 nova-compute[107505]: {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:08:17.431340 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Preparing to wait for external event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:08:17.431340 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:17.431340 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.002s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:17.431340 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.002s {{(pid=107505) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:17.431340 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2023-08-30T14:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1551096224',display_name='tempest-ServersAdminTestJSON-server-1551096224',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadmintestjson-server-1551096224',id=7,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:08:10Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b756ee34b70d4189a05bf0a0cb751e60',ramdisk_id='',reservation_id='r-mntu6p6e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_o
wner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersAdminTestJSON-499476127',owner_user_name='tempest-ServersAdminTestJSON-499476127-project-member'},tags=,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:08:15Z,user_data=None,user_id='5a6677b100d348b69575a22d363e18f4',uuid=3e4eb957-d13b-4d8c-be3b-bad1d61114fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:08:17.432383 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converting VIF {"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", 
"bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:08:17.433286 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:08:17.434253 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:08:17.435311 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:17.436446 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:17.437204 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:08:17.445082 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:17.445791 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap614532f0-00, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:17.447127 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap614532f0-00, 
col_values=(('external_ids', {'iface-id': '614532f0-0086-4382-b9bd-08c309fc6548', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:63:35', 'vm-uuid': '3e4eb957-d13b-4d8c-be3b-bad1d61114fe'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:17.457216 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:17.467208 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:08:17.469034 np0035104604 nova-compute[107505]: INFO os_vif [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') Aug 30 14:08:17.515323 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:08:17.516111 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:08:17.516576 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] No VIF found with MAC fa:16:3e:e5:63:35, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:08:17.518295 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Using config drive Aug 30 14:08:17.547095 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:17.564821 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lazy-loading 'ec2_ids' on Instance uuid 3e4eb957-d13b-4d8c-be3b-bad1d61114fe {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:08:17.638934 np0035104604 nova-compute[107505]: DEBUG nova.objects.base [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Object Instance<3e4eb957-d13b-4d8c-be3b-bad1d61114fe> lazy-loaded attributes: vcpu_model,ec2_ids {{(pid=107505) wrapper /opt/stack/nova/nova/objects/base.py:126}} Aug 30 14:08:17.639399 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance 
[None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lazy-loading 'keypairs' on Instance uuid 3e4eb957-d13b-4d8c-be3b-bad1d61114fe {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:08:17.696393 np0035104604 nova-compute[107505]: DEBUG nova.objects.base [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Object Instance<3e4eb957-d13b-4d8c-be3b-bad1d61114fe> lazy-loaded attributes: vcpu_model,ec2_ids,keypairs {{(pid=107505) wrapper /opt/stack/nova/nova/objects/base.py:126}} Aug 30 14:08:18.091689 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Creating config drive at /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config Aug 30 14:08:18.097858 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmphl_vpxg1 {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:18.147187 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config 
-ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmphl_vpxg1" returned: 0 in 0.049s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:18.189816 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] rbd image 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:18.194680 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:18.363284 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config 3e4eb957-d13b-4d8c-be3b-bad1d61114fe_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.169s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:18.363948 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 
tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Deleting local config drive /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe/disk.config because it was imported into RBD. Aug 30 14:08:18.391729 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:18.403043 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:18.407552 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:18.572153 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:18.668095 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:18.884510 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-15bc6ad0-21e9-4bc7-8b93-14cb5830716c req-0740f261-b88a-4f30-bed9-496f2f5acdfa service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:18.885119 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-15bc6ad0-21e9-4bc7-8b93-14cb5830716c req-0740f261-b88a-4f30-bed9-496f2f5acdfa service nova] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:18.885491 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-15bc6ad0-21e9-4bc7-8b93-14cb5830716c req-0740f261-b88a-4f30-bed9-496f2f5acdfa service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:18.885931 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-15bc6ad0-21e9-4bc7-8b93-14cb5830716c req-0740f261-b88a-4f30-bed9-496f2f5acdfa service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:18.886349 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-15bc6ad0-21e9-4bc7-8b93-14cb5830716c req-0740f261-b88a-4f30-bed9-496f2f5acdfa service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Processing event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:08:19.090876 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:19.430460 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Removed pending event for 3e4eb957-d13b-4d8c-be3b-bad1d61114fe due to event {{(pid=107505) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:438}} Aug 30 14:08:19.470954 
np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:08:19.470954 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] VM Started (Lifecycle Event) Aug 30 14:08:19.470954 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:08:19.470954 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:08:19.470954 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Instance spawned successfully. 
Aug 30 14:08:19.470954 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:08:19.470954 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:19.475781 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:08:19.483890 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:19.484570 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found 
default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:19.485397 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:19.487739 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:19.488552 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:19.489810 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:08:19.498019 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 
3e4eb957-d13b-4d8c-be3b-bad1d61114fe] During sync_power_state the instance has a pending task (rebuild_spawning). Skip. Aug 30 14:08:19.498716 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:08:19.499121 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] VM Paused (Lifecycle Event) Aug 30 14:08:19.519314 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:19.527383 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:08:19.528086 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] VM Resumed (Lifecycle Event) Aug 30 14:08:19.548758 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:19.599217 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 {{(pid=107505) 
handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:08:19.626385 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] During sync_power_state the instance has a pending task (rebuild_spawning). Skip. Aug 30 14:08:19.739763 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:19.847673 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:19.848687 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:19.849579 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Trying to apply a migration context that does not seem to be set for this instance {{(pid=107505) 
apply_migration_context /opt/stack/nova/nova/objects/instance.py:1078}} Aug 30 14:08:19.961907 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-d4cb1ac3-06e2-4856-ba41-aeb5f7b5da82 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.113s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:20.261065 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:21.031388 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-1212e63d-6381-42d4-b88a-6f2cee3dabfa req-5873ce61-435f-42c0-b077-3d66c0ac2fd9 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:21.031858 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1212e63d-6381-42d4-b88a-6f2cee3dabfa req-5873ce61-435f-42c0-b077-3d66c0ac2fd9 service nova] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:21.032481 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1212e63d-6381-42d4-b88a-6f2cee3dabfa req-5873ce61-435f-42c0-b077-3d66c0ac2fd9 service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:21.032789 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1212e63d-6381-42d4-b88a-6f2cee3dabfa req-5873ce61-435f-42c0-b077-3d66c0ac2fd9 service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:21.033286 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-1212e63d-6381-42d4-b88a-6f2cee3dabfa req-5873ce61-435f-42c0-b077-3d66c0ac2fd9 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] No waiting events found dispatching network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:08:21.033710 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-1212e63d-6381-42d4-b88a-6f2cee3dabfa req-5873ce61-435f-42c0-b077-3d66c0ac2fd9 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received unexpected event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 for instance with vm_state active and task_state None. 
Aug 30 14:08:22.450669 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:23.198809 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:23.260728 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:23.261626 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:23.262356 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:23.263191 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None 
req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:23.263647 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:23.268312 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Terminating instance Aug 30 14:08:23.271420 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Start destroying the instance on the hypervisor. 
{{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:08:23.352664 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:23.378910 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:23.551496 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:23.553165 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:23.670449 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:23.684831 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:23.693706 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:23.725148 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Instance destroyed successfully. 
Aug 30 14:08:23.726216 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lazy-loading 'resources' on Instance uuid 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:08:23.739496 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:07:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1373465328',display_name='tempest-ServersAdminTestJSON-server-1373465328',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadmintestjson-server-1373465328',id=9,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:07:54Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b756ee34b70d4189a05bf0a0cb751e60',ramdisk_id='',reservation_id='r-5xdqendf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06
f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersAdminTestJSON-499476127',owner_user_name='tempest-ServersAdminTestJSON-499476127-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-08-30T14:07:54Z,user_data=None,user_id='5a6677b100d348b69575a22d363e18f4',uuid=4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c99b14a-715d-489f-9459-82876d9ca624", "address": "fa:16:3e:11:c2:65", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c99b14a-71", "ovs_interfaceid": "3c99b14a-715d-489f-9459-82876d9ca624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 
30 14:08:23.740245 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converting VIF {"id": "3c99b14a-715d-489f-9459-82876d9ca624", "address": "fa:16:3e:11:c2:65", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c99b14a-71", "ovs_interfaceid": "3c99b14a-715d-489f-9459-82876d9ca624", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:08:23.741637 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:c2:65,bridge_name='br-int',has_traffic_filtering=True,id=3c99b14a-715d-489f-9459-82876d9ca624,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c99b14a-71') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:08:23.742734 np0035104604 
nova-compute[107505]: DEBUG os_vif [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:c2:65,bridge_name='br-int',has_traffic_filtering=True,id=3c99b14a-715d-489f-9459-82876d9ca624,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c99b14a-71') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:08:23.745935 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:23.746516 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c99b14a-71, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:23.754474 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:08:23.759995 np0035104604 nova-compute[107505]: INFO os_vif [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:c2:65,bridge_name='br-int',has_traffic_filtering=True,id=3c99b14a-715d-489f-9459-82876d9ca624,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c99b14a-71') Aug 30 14:08:23.961106 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager 
[req-5c5b80ba-8539-4bf3-937d-468055397e85 req-d25e07d1-03c5-4ef9-a253-f54f81745aa3 service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Received event network-vif-unplugged-3c99b14a-715d-489f-9459-82876d9ca624 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:23.961700 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-5c5b80ba-8539-4bf3-937d-468055397e85 req-d25e07d1-03c5-4ef9-a253-f54f81745aa3 service nova] Acquiring lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:23.962816 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-5c5b80ba-8539-4bf3-937d-468055397e85 req-d25e07d1-03c5-4ef9-a253-f54f81745aa3 service nova] Lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:23.963218 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-5c5b80ba-8539-4bf3-937d-468055397e85 req-d25e07d1-03c5-4ef9-a253-f54f81745aa3 service nova] Lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:23.963619 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-5c5b80ba-8539-4bf3-937d-468055397e85 req-d25e07d1-03c5-4ef9-a253-f54f81745aa3 service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] No waiting events found dispatching network-vif-unplugged-3c99b14a-715d-489f-9459-82876d9ca624 {{(pid=107505) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:08:23.964606 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-5c5b80ba-8539-4bf3-937d-468055397e85 req-d25e07d1-03c5-4ef9-a253-f54f81745aa3 service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Received event network-vif-unplugged-3c99b14a-715d-489f-9459-82876d9ca624 for instance with task_state deleting. {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Aug 30 14:08:24.094519 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Deleting instance files /opt/stack/data/nova/instances/4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc_del Aug 30 14:08:24.095490 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Deletion of /opt/stack/data/nova/instances/4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc_del complete Aug 30 14:08:24.184382 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Took 0.91 seconds to destroy the instance on the hypervisor. Aug 30 14:08:24.185031 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:08:24.185481 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:08:24.185817 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:08:25.148780 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:08:25.168232 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Took 0.98 seconds to deallocate network for instance. 
Aug 30 14:08:25.227773 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:25.228182 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:26.045604 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-9c34a309-560b-4d39-9ac8-1d27387c1277 req-67255c68-6379-4c81-a071-ebdf994c62a4 service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Received event network-vif-plugged-3c99b14a-715d-489f-9459-82876d9ca624 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:26.046059 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-9c34a309-560b-4d39-9ac8-1d27387c1277 req-67255c68-6379-4c81-a071-ebdf994c62a4 service nova] Acquiring lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:26.046437 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-9c34a309-560b-4d39-9ac8-1d27387c1277 req-67255c68-6379-4c81-a071-ebdf994c62a4 service nova] Lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:26.046809 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-9c34a309-560b-4d39-9ac8-1d27387c1277 req-67255c68-6379-4c81-a071-ebdf994c62a4 service nova] Lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:26.047177 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-9c34a309-560b-4d39-9ac8-1d27387c1277 req-67255c68-6379-4c81-a071-ebdf994c62a4 service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] No waiting events found dispatching network-vif-plugged-3c99b14a-715d-489f-9459-82876d9ca624 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:08:26.047540 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-9c34a309-560b-4d39-9ac8-1d27387c1277 req-67255c68-6379-4c81-a071-ebdf994c62a4 service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Received unexpected event network-vif-plugged-3c99b14a-715d-489f-9459-82876d9ca624 for instance with vm_state deleted and task_state None. 
Aug 30 14:08:26.047952 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-9c34a309-560b-4d39-9ac8-1d27387c1277 req-67255c68-6379-4c81-a071-ebdf994c62a4 service nova] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Received event network-vif-deleted-3c99b14a-715d-489f-9459-82876d9ca624 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:26.069845 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:26.774525 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.703s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:26.782917 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:08:26.803793 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory 
data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:08:26.947329 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.719s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:27.145359 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Deleted allocations for instance 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc Aug 30 14:08:27.241022 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-678a9a1a-f07f-43ec-ad9f-0dd39f94468a tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.979s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:27.749602 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "8f0941f9-96cc-445a-9d3b-83aa415ff950" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:27.750689 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "8f0941f9-96cc-445a-9d3b-83aa415ff950" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:27.751180 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:27.751486 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:27.752268 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" "released" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:27.756690 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Terminating instance Aug 30 14:08:27.760961 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Start destroying the instance on the hypervisor. {{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:08:27.875204 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:27.904541 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:28.060475 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:28.072440 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:28.175874 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:28.186474 
np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:28.208319 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Instance destroyed successfully. Aug 30 14:08:28.209366 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lazy-loading 'resources' on Instance uuid 8f0941f9-96cc-445a-9d3b-83aa415ff950 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:08:28.224248 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:07:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-566464463',display_name='tempest-ServersAdminTestJSON-server-566464463',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadmintestjson-server-566464463',id=8,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:07:37Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type
=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b756ee34b70d4189a05bf0a0cb751e60',ramdisk_id='',reservation_id='r-g8rakioj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersAdminTestJSON-499476127',owner_user_name='tempest-ServersAdminTestJSON-499476127-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-08-30T14:07:38Z,user_data=None,user_id='5a6677b100d348b69575a22d363e18f4',uuid=8f0941f9-96cc-445a-9d3b-83aa415ff950,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "address": "fa:16:3e:41:92:45", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, 
"connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3195884f-aa", "ovs_interfaceid": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:08:28.224832 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converting VIF {"id": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "address": "fa:16:3e:41:92:45", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3195884f-aa", "ovs_interfaceid": "3195884f-aa65-4b41-af9d-9f8d326dbc95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:08:28.226636 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:41:92:45,bridge_name='br-int',has_traffic_filtering=True,id=3195884f-aa65-4b41-af9d-9f8d326dbc95,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3195884f-aa') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:08:28.229455 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:92:45,bridge_name='br-int',has_traffic_filtering=True,id=3195884f-aa65-4b41-af9d-9f8d326dbc95,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3195884f-aa') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:08:28.234200 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:28.234940 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3195884f-aa, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:28.243408 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:08:28.247680 np0035104604 nova-compute[107505]: INFO os_vif [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Successfully 
unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:92:45,bridge_name='br-int',has_traffic_filtering=True,id=3195884f-aa65-4b41-af9d-9f8d326dbc95,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3195884f-aa') Aug 30 14:08:28.344085 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-eba99038-f24b-496a-908b-930e1b220a7c req-e0f38f22-e832-4504-a130-299dad0d2671 service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Received event network-vif-unplugged-3195884f-aa65-4b41-af9d-9f8d326dbc95 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:28.344706 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-eba99038-f24b-496a-908b-930e1b220a7c req-e0f38f22-e832-4504-a130-299dad0d2671 service nova] Acquiring lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:28.344952 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-eba99038-f24b-496a-908b-930e1b220a7c req-e0f38f22-e832-4504-a130-299dad0d2671 service nova] Lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:28.345348 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-eba99038-f24b-496a-908b-930e1b220a7c req-e0f38f22-e832-4504-a130-299dad0d2671 service nova] Lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:28.345818 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-eba99038-f24b-496a-908b-930e1b220a7c req-e0f38f22-e832-4504-a130-299dad0d2671 service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] No waiting events found dispatching network-vif-unplugged-3195884f-aa65-4b41-af9d-9f8d326dbc95 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:08:28.346190 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-eba99038-f24b-496a-908b-930e1b220a7c req-e0f38f22-e832-4504-a130-299dad0d2671 service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Received event network-vif-unplugged-3195884f-aa65-4b41-af9d-9f8d326dbc95 for instance with task_state deleting. {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Aug 30 14:08:28.616409 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Deleting instance files /opt/stack/data/nova/instances/8f0941f9-96cc-445a-9d3b-83aa415ff950_del Aug 30 14:08:28.617110 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Deletion of /opt/stack/data/nova/instances/8f0941f9-96cc-445a-9d3b-83aa415ff950_del complete Aug 30 14:08:28.677766 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:28.693344 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None 
req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Took 0.93 seconds to destroy the instance on the hypervisor. Aug 30 14:08:28.694332 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:08:28.695368 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:08:28.695623 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:08:29.669148 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:08:29.701297 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Took 1.01 seconds to deallocate network for instance. 
Aug 30 14:08:29.792562 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:29.793369 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:30.460625 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-d4c89a3a-55f5-4ddb-babb-e709174bd1c7 req-29428ee0-689e-4aee-aab5-0ef4522a29fb service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Received event network-vif-plugged-3195884f-aa65-4b41-af9d-9f8d326dbc95 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:30.465063 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-d4c89a3a-55f5-4ddb-babb-e709174bd1c7 req-29428ee0-689e-4aee-aab5-0ef4522a29fb service nova] Acquiring lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:30.465063 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-d4c89a3a-55f5-4ddb-babb-e709174bd1c7 req-29428ee0-689e-4aee-aab5-0ef4522a29fb service nova] Lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:30.465063 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-d4c89a3a-55f5-4ddb-babb-e709174bd1c7 req-29428ee0-689e-4aee-aab5-0ef4522a29fb service nova] Lock "8f0941f9-96cc-445a-9d3b-83aa415ff950-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:30.465063 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-d4c89a3a-55f5-4ddb-babb-e709174bd1c7 req-29428ee0-689e-4aee-aab5-0ef4522a29fb service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] No waiting events found dispatching network-vif-plugged-3195884f-aa65-4b41-af9d-9f8d326dbc95 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:08:30.465063 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-d4c89a3a-55f5-4ddb-babb-e709174bd1c7 req-29428ee0-689e-4aee-aab5-0ef4522a29fb service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Received unexpected event network-vif-plugged-3195884f-aa65-4b41-af9d-9f8d326dbc95 for instance with vm_state deleted and task_state None. 
Aug 30 14:08:30.465063 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-d4c89a3a-55f5-4ddb-babb-e709174bd1c7 req-29428ee0-689e-4aee-aab5-0ef4522a29fb service nova] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Received event network-vif-deleted-3195884f-aa65-4b41-af9d-9f8d326dbc95 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:30.510255 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:31.360000 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.851s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:31.372513 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:08:31.414762 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory 
data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:08:31.460704 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.667s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:31.692220 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Deleted allocations for instance 8f0941f9-96cc-445a-9d3b-83aa415ff950 Aug 30 14:08:31.838289 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4741b47b-4b51-4cc9-b7dc-6356ef095e53 tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "8f0941f9-96cc-445a-9d3b-83aa415ff950" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 4.087s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:32.198267 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:32.198920 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:32.199453 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:32.199876 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:32.200517 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:32.220984 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Terminating instance Aug 30 14:08:32.224921 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Start destroying the instance on the hypervisor. {{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:08:32.306793 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:32.330945 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:32.344858 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:32.372132 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:32.485597 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Instance destroyed successfully. 
Aug 30 14:08:32.486637 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lazy-loading 'resources' on Instance uuid 3e4eb957-d13b-4d8c-be3b-bad1d61114fe {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:08:32.503618 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2023-08-30T14:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1551096224',display_name='tempest-ServersAdminTestJSON-server-1551096224',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadmintestjson-server-1551096224',id=7,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:08:19Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b756ee34b70d4189a05bf0a0cb751e60',ramdisk_id='',reservation_id='r-mntu6p6e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='a32d892d-d
3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersAdminTestJSON-499476127',owner_user_name='tempest-ServersAdminTestJSON-499476127-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-08-30T14:08:23Z,user_data=None,user_id='5a6677b100d348b69575a22d363e18f4',uuid=3e4eb957-d13b-4d8c-be3b-bad1d61114fe,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug 
/opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:08:32.504052 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converting VIF {"id": "614532f0-0086-4382-b9bd-08c309fc6548", "address": "fa:16:3e:e5:63:35", "network": {"id": "089473b3-6053-4734-b86a-3935a1be492f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-2061586110-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b756ee34b70d4189a05bf0a0cb751e60", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap614532f0-00", "ovs_interfaceid": "614532f0-0086-4382-b9bd-08c309fc6548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:08:32.505461 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') {{(pid=107505) nova_to_osvif_vif 
/opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:08:32.506128 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:08:32.510027 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:32.510509 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap614532f0-00, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:32.514111 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:32.519100 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:08:32.523075 np0035104604 nova-compute[107505]: INFO os_vif [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:e5:63:35,bridge_name='br-int',has_traffic_filtering=True,id=614532f0-0086-4382-b9bd-08c309fc6548,network=Network(089473b3-6053-4734-b86a-3935a1be492f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap614532f0-00') Aug 30 14:08:32.915208 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ae6d76c4-398e-4129-8781-eec8f1d98c95 req-57fb55a3-d480-47f9-8534-291df87697bf service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received event network-vif-unplugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:32.915570 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ae6d76c4-398e-4129-8781-eec8f1d98c95 req-57fb55a3-d480-47f9-8534-291df87697bf service nova] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:32.916577 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ae6d76c4-398e-4129-8781-eec8f1d98c95 req-57fb55a3-d480-47f9-8534-291df87697bf service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:32.917089 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ae6d76c4-398e-4129-8781-eec8f1d98c95 req-57fb55a3-d480-47f9-8534-291df87697bf service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:32.917607 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ae6d76c4-398e-4129-8781-eec8f1d98c95 req-57fb55a3-d480-47f9-8534-291df87697bf service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] No waiting events found dispatching network-vif-unplugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:08:32.918148 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ae6d76c4-398e-4129-8781-eec8f1d98c95 req-57fb55a3-d480-47f9-8534-291df87697bf service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received event network-vif-unplugged-614532f0-0086-4382-b9bd-08c309fc6548 for instance with task_state deleting. {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Aug 30 14:08:32.930575 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Deleting instance files /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe_del Aug 30 14:08:32.932141 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Deletion of /opt/stack/data/nova/instances/3e4eb957-d13b-4d8c-be3b-bad1d61114fe_del complete Aug 30 14:08:32.999988 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Took 0.77 seconds to destroy the instance on the 
hypervisor. Aug 30 14:08:33.000513 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:08:33.000904 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:08:33.001208 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:08:33.708988 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:08:33.794890 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Took 0.79 seconds to deallocate network for instance. 
Aug 30 14:08:33.852381 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:33.852843 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:34.375648 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:34.980622 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-a61e0f65-eab6-44f0-b4cd-c786eb4be681 req-d18b474e-5627-4381-bf13-66008eaec8a2 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:34.981462 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-a61e0f65-eab6-44f0-b4cd-c786eb4be681 req-d18b474e-5627-4381-bf13-66008eaec8a2 service nova] Acquiring lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:34.982070 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-a61e0f65-eab6-44f0-b4cd-c786eb4be681 req-d18b474e-5627-4381-bf13-66008eaec8a2 service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:34.982674 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-a61e0f65-eab6-44f0-b4cd-c786eb4be681 req-d18b474e-5627-4381-bf13-66008eaec8a2 service nova] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:34.983296 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-a61e0f65-eab6-44f0-b4cd-c786eb4be681 req-d18b474e-5627-4381-bf13-66008eaec8a2 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] No waiting events found dispatching network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:08:34.983739 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-a61e0f65-eab6-44f0-b4cd-c786eb4be681 req-d18b474e-5627-4381-bf13-66008eaec8a2 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received unexpected event network-vif-plugged-614532f0-0086-4382-b9bd-08c309fc6548 for instance with vm_state deleted and task_state None. 
Aug 30 14:08:34.984194 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-a61e0f65-eab6-44f0-b4cd-c786eb4be681 req-d18b474e-5627-4381-bf13-66008eaec8a2 service nova] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Received event network-vif-deleted-614532f0-0086-4382-b9bd-08c309fc6548 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:35.084403 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.706s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:35.091496 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:08:35.109345 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:08:35.146127 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.293s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:35.312575 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Deleted allocations for instance 3e4eb957-d13b-4d8c-be3b-bad1d61114fe Aug 30 14:08:35.431685 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dff6a006-c42f-43fd-aabe-c584fbd7adab tempest-ServersAdminTestJSON-499476127 tempest-ServersAdminTestJSON-499476127-project-member] Lock "3e4eb957-d13b-4d8c-be3b-bad1d61114fe" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.233s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:37.512171 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:38.676074 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:38.711538 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:08:38.711834 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] 
[instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] VM Stopped (Lifecycle Event) Aug 30 14:08:38.739854 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-7b6becda-b09b-42c2-ba3b-6aed735497c2 None None] [instance: 4918c1d5-b7cd-4eb8-8cc3-79be4a6eaabc] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:39.272206 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:40.781051 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:42.308404 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:42.513019 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:43.205990 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:08:43.207023 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] VM Stopped (Lifecycle Event) Aug 30 14:08:43.232261 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-ce4b936e-9331-4b25-914b-5eb6774d258c None None] [instance: 8f0941f9-96cc-445a-9d3b-83aa415ff950] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:43.677584 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:47.478796 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:08:47.479272 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] VM Stopped (Lifecycle Event) Aug 30 14:08:47.499895 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-11a60d1c-3b5a-4b98-90b1-6107ac902877 None None] [instance: 3e4eb957-d13b-4d8c-be3b-bad1d61114fe] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:47.514667 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:48.005177 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:48.005509 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:48.025659 np0035104604 
nova-compute[107505]: DEBUG nova.compute.manager [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Starting instance... {{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:08:48.256765 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:48.257297 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:48.268961 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:08:48.269390 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Claim successful on node np0035104604 Aug 30 14:08:48.702250 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:48.890604 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:49.634065 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.742s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:49.643238 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:08:49.660009 np0035104604 
nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:08:50.070525 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.812s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:50.071121 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:08:50.132964 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:08:50.133278 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:08:50.239368 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:08:50.259726 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Start building block device mappings for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:08:50.461572 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:08:50.462515 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:08:50.462994 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Creating image(s) Aug 30 14:08:50.488838 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] rbd image 31c8d191-4fe4-444a-9e36-e7506e8ebc76_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:50.520993 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] rbd image 31c8d191-4fe4-444a-9e36-e7506e8ebc76_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:50.554429 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] rbd image 31c8d191-4fe4-444a-9e36-e7506e8ebc76_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:50.559414 np0035104604 
nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:50.606608 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8202a5c338754bafb4d49f80004f2a7d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9c6fb09777a641f6a3fae7b21c67ed1a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:08:50.744624 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.185s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} 
Aug 30 14:08:50.745140 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:50.745915 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:50.746461 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:50.772588 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] rbd image 31c8d191-4fe4-444a-9e36-e7506e8ebc76_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:50.776083 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa 
tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 31c8d191-4fe4-444a-9e36-e7506e8ebc76_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:51.116308 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 31c8d191-4fe4-444a-9e36-e7506e8ebc76_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.339s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:51.201327 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] resizing rbd image 31c8d191-4fe4-444a-9e36-e7506e8ebc76_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:08:51.295342 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:08:51.295898 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa 
tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Ensure instance console log exists: /opt/stack/data/nova/instances/31c8d191-4fe4-444a-9e36-e7506e8ebc76/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:08:51.296500 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:51.297225 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:51.298029 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:51.921340 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] 
[instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Successfully created port: 09e56312-7954-44ef-865e-7defd97f48a5 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:08:52.516723 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:52.777342 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Successfully updated port: 09e56312-7954-44ef-865e-7defd97f48a5 {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:08:52.798067 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "refresh_cache-31c8d191-4fe4-444a-9e36-e7506e8ebc76" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:08:52.798296 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquired lock "refresh_cache-31c8d191-4fe4-444a-9e36-e7506e8ebc76" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:08:52.798693 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Building network 
info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:08:52.942144 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-06514427-83b6-4fb9-bc5e-4314a5b00781 req-e1ff18b5-5adc-4293-9bcb-f513a14f85d3 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Received event network-changed-09e56312-7954-44ef-865e-7defd97f48a5 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:08:52.942345 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-06514427-83b6-4fb9-bc5e-4314a5b00781 req-e1ff18b5-5adc-4293-9bcb-f513a14f85d3 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Refreshing instance network info cache due to event network-changed-09e56312-7954-44ef-865e-7defd97f48a5. {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:08:52.942584 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-06514427-83b6-4fb9-bc5e-4314a5b00781 req-e1ff18b5-5adc-4293-9bcb-f513a14f85d3 service nova] Acquiring lock "refresh_cache-31c8d191-4fe4-444a-9e36-e7506e8ebc76" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:08:52.982799 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Instance cache missing network info. 
{{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:08:53.393692 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:53.684777 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:53.905847 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Updating instance_info_cache with network_info: [{"id": "09e56312-7954-44ef-865e-7defd97f48a5", "address": "fa:16:3e:c7:c9:51", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.76", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09e56312-79", "ovs_interfaceid": "09e56312-7954-44ef-865e-7defd97f48a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:08:53.929211 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Releasing lock "refresh_cache-31c8d191-4fe4-444a-9e36-e7506e8ebc76" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:08:53.929718 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Instance network_info: |[{"id": "09e56312-7954-44ef-865e-7defd97f48a5", "address": "fa:16:3e:c7:c9:51", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.76", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09e56312-79", "ovs_interfaceid": "09e56312-7954-44ef-865e-7defd97f48a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:08:53.930396 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-06514427-83b6-4fb9-bc5e-4314a5b00781 req-e1ff18b5-5adc-4293-9bcb-f513a14f85d3 service nova] Acquired lock "refresh_cache-31c8d191-4fe4-444a-9e36-e7506e8ebc76" 
{{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:08:53.930816 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-06514427-83b6-4fb9-bc5e-4314a5b00781 req-e1ff18b5-5adc-4293-9bcb-f513a14f85d3 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Refreshing network info cache for port 09e56312-7954-44ef-865e-7defd97f48a5 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:08:53.937184 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Start _get_guest_xml network_info=[{"id": "09e56312-7954-44ef-865e-7defd97f48a5", "address": "fa:16:3e:c7:c9:51", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.76", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09e56312-79", "ovs_interfaceid": "09e56312-7954-44ef-865e-7defd97f48a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 
'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:08:54.056624 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:08:54.061910 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... 
{{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:08:54.062493 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:08:54.064189 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:08:54.064517 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:08:54.065909 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:08:54.066673 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:08:54.067006 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:08:54.067364 
np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:08:54.067668 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:08:54.067967 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:08:54.068224 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:08:54.068597 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:08:54.069032 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None 
req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:08:54.069321 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:08:54.069595 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:08:54.069914 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:08:54.085676 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:54.899572 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.814s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:54.928234 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] rbd image 31c8d191-4fe4-444a-9e36-e7506e8ebc76_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:54.946658 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:55.516053 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-06514427-83b6-4fb9-bc5e-4314a5b00781 req-e1ff18b5-5adc-4293-9bcb-f513a14f85d3 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Updated VIF entry in instance network info cache for port 09e56312-7954-44ef-865e-7defd97f48a5. 
{{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:08:55.516579 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-06514427-83b6-4fb9-bc5e-4314a5b00781 req-e1ff18b5-5adc-4293-9bcb-f513a14f85d3 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Updating instance_info_cache with network_info: [{"id": "09e56312-7954-44ef-865e-7defd97f48a5", "address": "fa:16:3e:c7:c9:51", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.76", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09e56312-79", "ovs_interfaceid": "09e56312-7954-44ef-865e-7defd97f48a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:08:55.530748 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-06514427-83b6-4fb9-bc5e-4314a5b00781 req-e1ff18b5-5adc-4293-9bcb-f513a14f85d3 service nova] Releasing lock "refresh_cache-31c8d191-4fe4-444a-9e36-e7506e8ebc76" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:08:55.632805 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 
tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.686s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:55.636146 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:08:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminNegativeTestJSON-server-1311328016',display_name='tempest-ServersAdminNegativeTestJSON-server-1311328016',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadminnegativetestjson-server-1311328016',id=11,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c6fb09777a641f6a3fae7b21c67ed1a',ramdisk_id='',reservation_id='r-st0iej1s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image
_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersAdminNegativeTestJSON-1539174490',owner_user_name='tempest-ServersAdminNegativeTestJSON-1539174490-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:08:50Z,user_data=None,user_id='8202a5c338754bafb4d49f80004f2a7d',uuid=31c8d191-4fe4-444a-9e36-e7506e8ebc76,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09e56312-7954-44ef-865e-7defd97f48a5", "address": "fa:16:3e:c7:c9:51", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.76", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09e56312-79", "ovs_interfaceid": "09e56312-7954-44ef-865e-7defd97f48a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:08:55.636870 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa 
tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Converting VIF {"id": "09e56312-7954-44ef-865e-7defd97f48a5", "address": "fa:16:3e:c7:c9:51", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.76", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09e56312-79", "ovs_interfaceid": "09e56312-7954-44ef-865e-7defd97f48a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:08:55.638424 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:c9:51,bridge_name='br-int',has_traffic_filtering=True,id=09e56312-7954-44ef-865e-7defd97f48a5,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09e56312-79') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:08:55.640274 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa 
tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lazy-loading 'pci_devices' on Instance uuid 31c8d191-4fe4-444a-9e36-e7506e8ebc76 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:08:55.658418 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] End _get_guest_xml xml= [multi-line guest domain XML omitted: the XML markup was stripped during log capture, leaving only text nodes interleaved with repeated journald continuation prefixes. Recoverable values: uuid 31c8d191-4fe4-444a-9e36-e7506e8ebc76, name instance-0000000b, memory 131072, 1 vCPU; nova metadata: instance name tempest-ServersAdminNegativeTestJSON-server-1311328016, creation time 2023-08-30 14:08:54, flavor values 128/1/0/0/1, owner tempest-ServersAdminNegativeTestJSON-1539174490-project-member / tempest-ServersAdminNegativeTestJSON-1539174490; sysinfo: OpenStack Foundation, OpenStack Nova, 27.1.0, serial/uuid 31c8d191-4fe4-444a-9e36-e7506e8ebc76, Virtual Machine; os type hvm; CPU model Nehalem; RNG backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:08:55.667023 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Preparing to wait for external event network-vif-plugged-09e56312-7954-44ef-865e-7defd97f48a5 {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:08:55.667023 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:08:55.667023 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" acquired by
"nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:08:55.667023 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:08:55.667023 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:08:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminNegativeTestJSON-server-1311328016',display_name='tempest-ServersAdminNegativeTestJSON-server-1311328016',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadminnegativetestjson-server-1311328016',id=11,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa
_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c6fb09777a641f6a3fae7b21c67ed1a',ramdisk_id='',reservation_id='r-st0iej1s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersAdminNegativeTestJSON-1539174490',owner_user_name='tempest-ServersAdminNegativeTestJSON-1539174490-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:08:50Z,user_data=None,user_id='8202a5c338754bafb4d49f80004f2a7d',uuid=31c8d191-4fe4-444a-9e36-e7506e8ebc76,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09e56312-7954-44ef-865e-7defd97f48a5", "address": "fa:16:3e:c7:c9:51", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.76", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, 
"devname": "tap09e56312-79", "ovs_interfaceid": "09e56312-7954-44ef-865e-7defd97f48a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:08:55.668102 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Converting VIF {"id": "09e56312-7954-44ef-865e-7defd97f48a5", "address": "fa:16:3e:c7:c9:51", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.76", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09e56312-79", "ovs_interfaceid": "09e56312-7954-44ef-865e-7defd97f48a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:08:55.668102 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:c7:c9:51,bridge_name='br-int',has_traffic_filtering=True,id=09e56312-7954-44ef-865e-7defd97f48a5,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09e56312-79') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:08:55.668102 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:c9:51,bridge_name='br-int',has_traffic_filtering=True,id=09e56312-7954-44ef-865e-7defd97f48a5,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09e56312-79') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:08:55.668102 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:55.668102 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:55.668102 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:08:55.669858 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:55.669858 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09e56312-79, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:55.670378 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09e56312-79, col_values=(('external_ids', {'iface-id': '09e56312-7954-44ef-865e-7defd97f48a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c7:c9:51', 'vm-uuid': '31c8d191-4fe4-444a-9e36-e7506e8ebc76'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:08:55.672899 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:55.678648 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:08:55.678984 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:55.680261 np0035104604 nova-compute[107505]: INFO os_vif [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:c7:c9:51,bridge_name='br-int',has_traffic_filtering=True,id=09e56312-7954-44ef-865e-7defd97f48a5,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09e56312-79') Aug 30 14:08:55.727313 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:08:55.727767 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] No BDM found with device name hda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:08:55.728038 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] No VIF found with MAC fa:16:3e:c7:c9:51, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:08:55.728925 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Using config drive Aug 30 14:08:55.752647 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 
tempest-ServersAdminNegativeTestJSON-1539174490-project-member] rbd image 31c8d191-4fe4-444a-9e36-e7506e8ebc76_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:56.135616 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Creating config drive at /opt/stack/data/nova/instances/31c8d191-4fe4-444a-9e36-e7506e8ebc76/disk.config Aug 30 14:08:56.142945 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/31c8d191-4fe4-444a-9e36-e7506e8ebc76/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmppvq0fyij {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:56.176566 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/31c8d191-4fe4-444a-9e36-e7506e8ebc76/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmppvq0fyij" returned: 0 in 0.033s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:56.214323 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa 
tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] rbd image 31c8d191-4fe4-444a-9e36-e7506e8ebc76_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:08:56.218320 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/31c8d191-4fe4-444a-9e36-e7506e8ebc76/disk.config 31c8d191-4fe4-444a-9e36-e7506e8ebc76_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:08:56.363462 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:56.445895 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/31c8d191-4fe4-444a-9e36-e7506e8ebc76/disk.config 31c8d191-4fe4-444a-9e36-e7506e8ebc76_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.227s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:08:56.446994 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Deleting local config drive 
/opt/stack/data/nova/instances/31c8d191-4fe4-444a-9e36-e7506e8ebc76/disk.config because it was imported into RBD. Aug 30 14:08:56.465434 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:56.514814 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:56.519108 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:56.870898 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:56.884057 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:56.894924 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:08:57.394946 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:08:57.395539 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] VM Started (Lifecycle Event) Aug 30 14:08:57.414041 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] 
[instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:57.420452 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:08:57.420912 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] VM Paused (Lifecycle Event) Aug 30 14:08:57.446542 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:08:57.450910 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:08:57.468917 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] During sync_power_state the instance has a pending task (spawning). Skip. 
Aug 30 14:08:58.681853 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:00.673581 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:01.763471 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-5e48b33f-35b6-4f0e-b162-2e2a5e87923f req-9a2407b2-6a5d-4f06-b633-7fa77bd5d824 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Received event network-vif-plugged-09e56312-7954-44ef-865e-7defd97f48a5 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:01.764063 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-5e48b33f-35b6-4f0e-b162-2e2a5e87923f req-9a2407b2-6a5d-4f06-b633-7fa77bd5d824 service nova] Acquiring lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:01.765397 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-5e48b33f-35b6-4f0e-b162-2e2a5e87923f req-9a2407b2-6a5d-4f06-b633-7fa77bd5d824 service nova] Lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:01.765881 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-5e48b33f-35b6-4f0e-b162-2e2a5e87923f req-9a2407b2-6a5d-4f06-b633-7fa77bd5d824 service nova] Lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:01.766199 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-5e48b33f-35b6-4f0e-b162-2e2a5e87923f req-9a2407b2-6a5d-4f06-b633-7fa77bd5d824 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Processing event network-vif-plugged-09e56312-7954-44ef-865e-7defd97f48a5 {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:09:01.767389 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Instance event wait completed in 4 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:09:01.783965 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:09:01.788712 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:09:01.794379 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] VM Resumed (Lifecycle Event) Aug 30 14:09:01.806386 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 
31c8d191-4fe4-444a-9e36-e7506e8ebc76] Instance spawned successfully. Aug 30 14:09:01.807639 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:09:01.832812 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:09:01.848642 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:09:01.853687 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:01.854150 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 
tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:01.854619 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:01.855202 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:01.855698 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:01.856215 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details 
/opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:01.870340 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:09:01.967464 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Took 11.51 seconds to spawn the instance on the hypervisor. Aug 30 14:09:01.967983 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:09:02.054648 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Took 13.97 seconds to build instance. 
Aug 30 14:09:02.076817 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-654deb33-b3ad-4b67-a9af-906a68dcdcfa tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.071s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:03.683801 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:04.003804 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-2988c750-a9f9-40e4-a7c8-7e5930346a32 req-3e91b5bb-735a-48ac-b371-39be625bddb2 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Received event network-vif-plugged-09e56312-7954-44ef-865e-7defd97f48a5 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:04.004128 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-2988c750-a9f9-40e4-a7c8-7e5930346a32 req-3e91b5bb-735a-48ac-b371-39be625bddb2 service nova] Acquiring lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:04.004457 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-2988c750-a9f9-40e4-a7c8-7e5930346a32 req-3e91b5bb-735a-48ac-b371-39be625bddb2 service nova] Lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:04.004779 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-2988c750-a9f9-40e4-a7c8-7e5930346a32 req-3e91b5bb-735a-48ac-b371-39be625bddb2 service nova] Lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:04.005060 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-2988c750-a9f9-40e4-a7c8-7e5930346a32 req-3e91b5bb-735a-48ac-b371-39be625bddb2 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] No waiting events found dispatching network-vif-plugged-09e56312-7954-44ef-865e-7defd97f48a5 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:09:04.005367 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-2988c750-a9f9-40e4-a7c8-7e5930346a32 req-3e91b5bb-735a-48ac-b371-39be625bddb2 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Received unexpected event network-vif-plugged-09e56312-7954-44ef-865e-7defd97f48a5 for instance with vm_state active and task_state None. 
Aug 30 14:09:05.089509 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:05.090112 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:05.106281 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Starting instance... 
{{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:09:05.326950 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:05.327807 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:05.335307 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:09:05.335777 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Claim successful on node np0035104604 Aug 30 14:09:05.715008 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:06.116515 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:06.905179 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.788s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:06.911024 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:09:06.926991 np0035104604 
nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:09:06.975325 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.648s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:06.976263 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:09:07.040579 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:09:07.040900 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:09:07.213770 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:09:07.240848 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Start building block device mappings for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:09:07.458356 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:09:07.460253 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:09:07.460997 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Creating image(s) Aug 30 14:09:07.503805 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] rbd image b23c4b18-61e2-4fd1-b593-e7d9f7488f70_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:07.548751 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] rbd image b23c4b18-61e2-4fd1-b593-e7d9f7488f70_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:07.592164 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] rbd image b23c4b18-61e2-4fd1-b593-e7d9f7488f70_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:07.598252 np0035104604 
nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:07.624364 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8202a5c338754bafb4d49f80004f2a7d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9c6fb09777a641f6a3fae7b21c67ed1a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:09:07.775483 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.178s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} 
Aug 30 14:09:07.822942 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:07.822942 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:07.822942 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:07.822942 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] rbd image b23c4b18-61e2-4fd1-b593-e7d9f7488f70_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:07.822942 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 
tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e b23c4b18-61e2-4fd1-b593-e7d9f7488f70_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:08.025482 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:09:08.054968 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:08.055467 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:08.055842 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:08.056279 np0035104604 nova-compute[107505]: DEBUG 
nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:09:08.056788 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:08.288002 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e b23c4b18-61e2-4fd1-b593-e7d9f7488f70_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:08.391059 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] resizing rbd image b23c4b18-61e2-4fd1-b593-e7d9f7488f70_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:09:08.505950 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} 
Aug 30 14:09:08.507095 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Ensure instance console log exists: /opt/stack/data/nova/instances/b23c4b18-61e2-4fd1-b593-e7d9f7488f70/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:09:08.507837 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:08.508427 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:08.508920 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:08.684074 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:08.862770 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.806s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:08.963567 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-0000000b as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:09:08.963852 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-0000000b as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:09:08.969893 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-0000000a as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:09:08.970315 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-0000000a as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:09:09.040815 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Aug 30 14:09:09.041891 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1079MB free_disk=29.929473876953125GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:09:09.042220 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:09.042421 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:09.282391 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:09.306487 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance f4820198-6af7-4434-b811-c208d50a5743 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:09:09.307045 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 31c8d191-4fe4-444a-9e36-e7506e8ebc76 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:09:09.307045 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance b23c4b18-61e2-4fd1-b593-e7d9f7488f70 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:09:09.307459 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 3 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:09:09.307824 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=896MB phys_disk=29GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:09:09.461494 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Successfully created port: 8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:09:10.098378 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:10.659483 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Successfully updated port: 8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:09:10.680368 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "refresh_cache-b23c4b18-61e2-4fd1-b593-e7d9f7488f70" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:09:10.680705 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquired lock "refresh_cache-b23c4b18-61e2-4fd1-b593-e7d9f7488f70" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:09:10.681020 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:09:10.881708 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} 
Aug 30 14:09:10.887880 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-773e8ef4-0361-49fc-9dc6-0fcfe79e5d65 req-1f13cbcb-1095-4f82-ba4f-016447ffcc92 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received event network-changed-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:10.888900 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-773e8ef4-0361-49fc-9dc6-0fcfe79e5d65 req-1f13cbcb-1095-4f82-ba4f-016447ffcc92 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Refreshing instance network info cache due to event network-changed-8615a5c7-fce0-4e13-bf14-4f123096a571. {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:09:10.890109 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-773e8ef4-0361-49fc-9dc6-0fcfe79e5d65 req-1f13cbcb-1095-4f82-ba4f-016447ffcc92 service nova] Acquiring lock "refresh_cache-b23c4b18-61e2-4fd1-b593-e7d9f7488f70" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:09:10.931229 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.834s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:10.941071 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:09:10.971252 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None 
None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:09:11.006683 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:09:11.152885 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:09:11.153259 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.111s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:11.904298 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Updating instance_info_cache with 
network_info: [{"id": "8615a5c7-fce0-4e13-bf14-4f123096a571", "address": "fa:16:3e:d3:c6:20", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8615a5c7-fc", "ovs_interfaceid": "8615a5c7-fce0-4e13-bf14-4f123096a571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:09:11.929077 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Releasing lock "refresh_cache-b23c4b18-61e2-4fd1-b593-e7d9f7488f70" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:09:11.929863 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Instance network_info: |[{"id": "8615a5c7-fce0-4e13-bf14-4f123096a571", "address": "fa:16:3e:d3:c6:20", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": 
"shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8615a5c7-fc", "ovs_interfaceid": "8615a5c7-fce0-4e13-bf14-4f123096a571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:09:11.930324 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-773e8ef4-0361-49fc-9dc6-0fcfe79e5d65 req-1f13cbcb-1095-4f82-ba4f-016447ffcc92 service nova] Acquired lock "refresh_cache-b23c4b18-61e2-4fd1-b593-e7d9f7488f70" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:09:11.930731 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-773e8ef4-0361-49fc-9dc6-0fcfe79e5d65 req-1f13cbcb-1095-4f82-ba4f-016447ffcc92 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Refreshing network info cache for port 8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:09:11.937924 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Start _get_guest_xml network_info=[{"id": 
"8615a5c7-fce0-4e13-bf14-4f123096a571", "address": "fa:16:3e:d3:c6:20", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8615a5c7-fc", "ovs_interfaceid": "8615a5c7-fce0-4e13-bf14-4f123096a571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 
'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:09:11.944085 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:09:12.068396 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:09:12.068396 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:09:12.081756 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:09:12.082686 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CPU controller found on host. {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:09:12.084544 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:09:12.085381 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 
14:09:12.086080 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:09:12.086540 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:09:12.087037 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:09:12.087490 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:09:12.087957 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:09:12.088555 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 
tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:09:12.089017 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:09:12.089509 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:09:12.090021 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:09:12.095069 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:09:12.113105 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 
tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:12.332251 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:09:12.332251 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:09:12.332251 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:09:12.332251 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:09:12.332251 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Skipping network cache update for instance because it is Building. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:09:12.659871 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "refresh_cache-f4820198-6af7-4434-b811-c208d50a5743" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:09:12.659871 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquired lock "refresh_cache-f4820198-6af7-4434-b811-c208d50a5743" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:09:12.659871 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: f4820198-6af7-4434-b811-c208d50a5743] Forcefully refreshing network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}} Aug 30 14:09:12.659871 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lazy-loading 'info_cache' on Instance uuid f4820198-6af7-4434-b811-c208d50a5743 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:09:12.912127 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.799s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:12.939463 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 
tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] rbd image b23c4b18-61e2-4fd1-b593-e7d9f7488f70_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:12.944259 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:13.745703 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:13.750169 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "f4820198-6af7-4434-b811-c208d50a5743" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:13.750719 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "f4820198-6af7-4434-b811-c208d50a5743" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:13.751202 np0035104604 nova-compute[107505]: 
DEBUG oslo_concurrency.lockutils [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "f4820198-6af7-4434-b811-c208d50a5743-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:13.751639 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "f4820198-6af7-4434-b811-c208d50a5743-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:13.752070 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "f4820198-6af7-4434-b811-c208d50a5743-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:13.756297 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Terminating instance Aug 30 14:09:13.759662 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 
tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Start destroying the instance on the hypervisor. {{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:09:13.824535 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:13.839801 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.896s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:13.842696 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:09:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminNegativeTestJSON-server-402883287',display_name='tempest-ServersAdminNegativeTestJSON-server-402883287',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadminnegativetestjson-server-402883287',id=12,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c6fb09777a641f6a3fae7b21c67ed1a',ramdisk_id='',reservation_id='r-60ao0st3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersAdminNegativeTestJSON-1539174490',owner_user_name='tempest-ServersAdminNegativeTestJSON-1539174490-project-member'},tags=TagList,task
_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:09:07Z,user_data=None,user_id='8202a5c338754bafb4d49f80004f2a7d',uuid=b23c4b18-61e2-4fd1-b593-e7d9f7488f70,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8615a5c7-fce0-4e13-bf14-4f123096a571", "address": "fa:16:3e:d3:c6:20", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8615a5c7-fc", "ovs_interfaceid": "8615a5c7-fce0-4e13-bf14-4f123096a571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:09:13.843690 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Converting VIF {"id": "8615a5c7-fce0-4e13-bf14-4f123096a571", "address": "fa:16:3e:d3:c6:20", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8615a5c7-fc", "ovs_interfaceid": "8615a5c7-fce0-4e13-bf14-4f123096a571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:09:13.846871 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c6:20,bridge_name='br-int',has_traffic_filtering=True,id=8615a5c7-fce0-4e13-bf14-4f123096a571,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8615a5c7-fc') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:09:13.846871 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lazy-loading 'pci_devices' on Instance uuid b23c4b18-61e2-4fd1-b593-e7d9f7488f70 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:09:13.847606 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:14.049200 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron 
[req-773e8ef4-0361-49fc-9dc6-0fcfe79e5d65 req-1f13cbcb-1095-4f82-ba4f-016447ffcc92 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Updated VIF entry in instance network info cache for port 8615a5c7-fce0-4e13-bf14-4f123096a571. {{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:09:14.052459 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-773e8ef4-0361-49fc-9dc6-0fcfe79e5d65 req-1f13cbcb-1095-4f82-ba4f-016447ffcc92 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Updating instance_info_cache with network_info: [{"id": "8615a5c7-fce0-4e13-bf14-4f123096a571", "address": "fa:16:3e:d3:c6:20", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8615a5c7-fc", "ovs_interfaceid": "8615a5c7-fce0-4e13-bf14-4f123096a571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:09:14.052459 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:14.057968 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None 
req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] End _get_guest_xml xml= [guest domain XML elided: all markup was stripped in this capture, leaving only bare values across the continuation lines; recoverable fields: uuid b23c4b18-61e2-4fd1-b593-e7d9f7488f70, name instance-0000000c, memory 131072, vcpus 1, nova metadata name tempest-ServersAdminNegativeTestJSON-server-402883287, creationTime 2023-08-30 14:09:11, flavor memory/disk/swap/ephemeral/vcpus 128/1/0/0/1, owner user tempest-ServersAdminNegativeTestJSON-1539174490-project-member, owner project tempest-ServersAdminNegativeTestJSON-1539174490, sysinfo manufacturer OpenStack Foundation, product OpenStack Nova, version 27.1.0, family Virtual Machine, os type hvm, cpu model Nehalem, rng backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:09:14.066200 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Preparing to wait for external event network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:09:14.066200 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:14.066200 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:14.066200 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 
tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:14.066200 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:09:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminNegativeTestJSON-server-402883287',display_name='tempest-ServersAdminNegativeTestJSON-server-402883287',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadminnegativetestjson-server-402883287',id=12,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c6fb09777a641f6a3fae7b21c67ed1a',ramdisk_id='',reservation_id='r-60ao0st3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_r
ef='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersAdminNegativeTestJSON-1539174490',owner_user_name='tempest-ServersAdminNegativeTestJSON-1539174490-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:09:07Z,user_data=None,user_id='8202a5c338754bafb4d49f80004f2a7d',uuid=b23c4b18-61e2-4fd1-b593-e7d9f7488f70,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8615a5c7-fce0-4e13-bf14-4f123096a571", "address": "fa:16:3e:d3:c6:20", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8615a5c7-fc", "ovs_interfaceid": "8615a5c7-fce0-4e13-bf14-4f123096a571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:09:14.070925 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util 
[None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Converting VIF {"id": "8615a5c7-fce0-4e13-bf14-4f123096a571", "address": "fa:16:3e:d3:c6:20", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8615a5c7-fc", "ovs_interfaceid": "8615a5c7-fce0-4e13-bf14-4f123096a571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:09:14.075346 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c6:20,bridge_name='br-int',has_traffic_filtering=True,id=8615a5c7-fce0-4e13-bf14-4f123096a571,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8615a5c7-fc') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:09:14.075346 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 
tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c6:20,bridge_name='br-int',has_traffic_filtering=True,id=8615a5c7-fce0-4e13-bf14-4f123096a571,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8615a5c7-fc') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:09:14.077435 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:14.078058 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:09:14.078724 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:09:14.080149 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-773e8ef4-0361-49fc-9dc6-0fcfe79e5d65 req-1f13cbcb-1095-4f82-ba4f-016447ffcc92 service nova] Releasing lock "refresh_cache-b23c4b18-61e2-4fd1-b593-e7d9f7488f70" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:09:14.091082 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:14.091717 np0035104604 nova-compute[107505]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8615a5c7-fc, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:09:14.092655 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8615a5c7-fc, col_values=(('external_ids', {'iface-id': '8615a5c7-fce0-4e13-bf14-4f123096a571', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:c6:20', 'vm-uuid': 'b23c4b18-61e2-4fd1-b593-e7d9f7488f70'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:09:14.101780 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:14.108901 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: f4820198-6af7-4434-b811-c208d50a5743] Instance destroyed successfully. 
Aug 30 14:09:14.109294 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:09:14.114600 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lazy-loading 'resources' on Instance uuid f4820198-6af7-4434-b811-c208d50a5743 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:09:14.118200 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:14.124186 np0035104604 nova-compute[107505]: INFO os_vif [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c6:20,bridge_name='br-int',has_traffic_filtering=True,id=8615a5c7-fce0-4e13-bf14-4f123096a571,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8615a5c7-fc') Aug 30 14:09:14.132251 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:07:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-348819062',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-348819062',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(35),hidden=False,host='np0035104604',hostname='tempest-serverswithspecificflavortestjson-server-348819062',id=10,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=35,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHHdXbSjxXwqDwe7VjknCsPB7FlaX9kbdidTvee1eM5xK4auSl9TOs3ulw+MzaECXIpkK+ZLwKiI+UH3J46BXA6E7xNkh4IyXHMT0+x8g5UZ8EaR+YM4WD80W1OwwUhUEg==',key_name='tempest-keypair-609321772',keypairs=,launch_index=0,launched_at=2023-08-30T14:08:06Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b35dffa3f90f40559bd146f98ed3083d',ramdisk_id='',reservation_id='r-uu3dhq4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os
_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-5414259',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-08-30T14:08:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d8043c2265984da88bd2252d8b2cf983',uuid=f4820198-6af7-4434-b811-c208d50a5743,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ef896d3c-f7c0-4559-859b-b05102fdb388", "address": "fa:16:3e:95:91:0b", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.10", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapef896d3c-f7", "ovs_interfaceid": "ef896d3c-f7c0-4559-859b-b05102fdb388", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:09:14.133069 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None 
req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Converting VIF {"id": "ef896d3c-f7c0-4559-859b-b05102fdb388", "address": "fa:16:3e:95:91:0b", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.10", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapef896d3c-f7", "ovs_interfaceid": "ef896d3c-f7c0-4559-859b-b05102fdb388", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:09:14.134097 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:95:91:0b,bridge_name='br-int',has_traffic_filtering=True,id=ef896d3c-f7c0-4559-859b-b05102fdb388,network=Network(33008139-2fd9-496e-ac17-91db64b6f4c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef896d3c-f7') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 
14:09:14.134810 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:91:0b,bridge_name='br-int',has_traffic_filtering=True,id=ef896d3c-f7c0-4559-859b-b05102fdb388,network=Network(33008139-2fd9-496e-ac17-91db64b6f4c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef896d3c-f7') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:09:14.140337 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:14.140985 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef896d3c-f7, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:09:14.144328 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:14.149281 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:09:14.151451 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:14.155281 np0035104604 nova-compute[107505]: INFO os_vif [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 
tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:91:0b,bridge_name='br-int',has_traffic_filtering=True,id=ef896d3c-f7c0-4559-859b-b05102fdb388,network=Network(33008139-2fd9-496e-ac17-91db64b6f4c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef896d3c-f7') Aug 30 14:09:14.333707 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:09:14.334830 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:09:14.334830 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] No VIF found with MAC fa:16:3e:d3:c6:20, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:09:14.335525 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Using config drive Aug 30 14:09:14.383860 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] rbd image b23c4b18-61e2-4fd1-b593-e7d9f7488f70_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:14.400909 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-d9cff8c5-76c4-4ce5-9a14-fe86b73320c6 req-5b5b3738-0e82-41c1-9aa4-c7760c884a49 service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Received event network-vif-unplugged-ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:14.401357 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-d9cff8c5-76c4-4ce5-9a14-fe86b73320c6 req-5b5b3738-0e82-41c1-9aa4-c7760c884a49 service nova] Acquiring lock "f4820198-6af7-4434-b811-c208d50a5743-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:14.401908 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-d9cff8c5-76c4-4ce5-9a14-fe86b73320c6 req-5b5b3738-0e82-41c1-9aa4-c7760c884a49 service nova] Lock "f4820198-6af7-4434-b811-c208d50a5743-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:14.402309 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-d9cff8c5-76c4-4ce5-9a14-fe86b73320c6 req-5b5b3738-0e82-41c1-9aa4-c7760c884a49 service nova] Lock "f4820198-6af7-4434-b811-c208d50a5743-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:14.402713 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-d9cff8c5-76c4-4ce5-9a14-fe86b73320c6 req-5b5b3738-0e82-41c1-9aa4-c7760c884a49 service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] No waiting events found dispatching network-vif-unplugged-ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:09:14.403237 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-d9cff8c5-76c4-4ce5-9a14-fe86b73320c6 req-5b5b3738-0e82-41c1-9aa4-c7760c884a49 service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Received event network-vif-unplugged-ef896d3c-f7c0-4559-859b-b05102fdb388 for instance with task_state deleting. 
{{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Aug 30 14:09:14.878401 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Deleting instance files /opt/stack/data/nova/instances/f4820198-6af7-4434-b811-c208d50a5743_del Aug 30 14:09:14.879222 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Deletion of /opt/stack/data/nova/instances/f4820198-6af7-4434-b811-c208d50a5743_del complete Aug 30 14:09:14.929755 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Creating config drive at /opt/stack/data/nova/instances/b23c4b18-61e2-4fd1-b593-e7d9f7488f70/disk.config Aug 30 14:09:14.933977 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/b23c4b18-61e2-4fd1-b593-e7d9f7488f70/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpv6543noa {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:14.968432 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None 
req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: f4820198-6af7-4434-b811-c208d50a5743] Took 1.21 seconds to destroy the instance on the hypervisor. Aug 30 14:09:14.969335 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:09:14.971339 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/b23c4b18-61e2-4fd1-b593-e7d9f7488f70/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpv6543noa" returned: 0 in 0.037s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:14.971954 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: f4820198-6af7-4434-b811-c208d50a5743] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:09:14.972336 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: f4820198-6af7-4434-b811-c208d50a5743] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:09:15.014339 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 
tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] rbd image b23c4b18-61e2-4fd1-b593-e7d9f7488f70_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:15.018443 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/b23c4b18-61e2-4fd1-b593-e7d9f7488f70/disk.config b23c4b18-61e2-4fd1-b593-e7d9f7488f70_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:15.204376 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/b23c4b18-61e2-4fd1-b593-e7d9f7488f70/disk.config b23c4b18-61e2-4fd1-b593-e7d9f7488f70_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.186s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:15.329652 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Deleting local config drive /opt/stack/data/nova/instances/b23c4b18-61e2-4fd1-b593-e7d9f7488f70/disk.config because it was imported into RBD. 
Aug 30 14:09:15.469419 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:15.547764 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:15.569143 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: f4820198-6af7-4434-b811-c208d50a5743] Updating instance_info_cache with network_info: [{"id": "ef896d3c-f7c0-4559-859b-b05102fdb388", "address": "fa:16:3e:95:91:0b", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.10", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapef896d3c-f7", "ovs_interfaceid": "ef896d3c-f7c0-4559-859b-b05102fdb388", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:09:15.588508 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] 
Releasing lock "refresh_cache-f4820198-6af7-4434-b811-c208d50a5743" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:09:15.588915 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: f4820198-6af7-4434-b811-c208d50a5743] Updated the network info_cache for instance {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} Aug 30 14:09:15.589718 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:09:15.590168 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:09:15.590558 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:09:15.591073 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:09:15.591807 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task 
ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:09:15.592780 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:09:15.593123 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:09:16.037656 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-3c6c03a8-e015-44a2-bd8e-61f52d1d224a req-144fd46d-a2b3-40c4-bed7-b268eda04628 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received event network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:16.038752 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-3c6c03a8-e015-44a2-bd8e-61f52d1d224a req-144fd46d-a2b3-40c4-bed7-b268eda04628 service nova] Acquiring lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:16.039277 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-3c6c03a8-e015-44a2-bd8e-61f52d1d224a req-144fd46d-a2b3-40c4-bed7-b268eda04628 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:16.039700 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-3c6c03a8-e015-44a2-bd8e-61f52d1d224a req-144fd46d-a2b3-40c4-bed7-b268eda04628 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:16.040129 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-3c6c03a8-e015-44a2-bd8e-61f52d1d224a req-144fd46d-a2b3-40c4-bed7-b268eda04628 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Processing event network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:09:16.380743 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:09:16.381176 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] VM Started (Lifecycle Event) Aug 30 14:09:16.385297 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "cd08791f-846f-4204-97a6-eb1701ed0723" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:16.385652 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None 
req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "cd08791f-846f-4204-97a6-eb1701ed0723" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:16.388190 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:09:16.392245 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:09:16.418095 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:09:16.420520 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Starting instance... 
{{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:09:16.441282 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Instance spawned successfully. Aug 30 14:09:16.442058 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:09:16.446137 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:09:16.472140 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:16.472424 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 
b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:16.473172 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:16.473695 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:16.474510 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:16.475127 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:16.481389 np0035104604 nova-compute[107505]: 
DEBUG nova.compute.manager [req-8f4817de-7e6f-4346-a3e5-c0471c845887 req-7dd08f06-1308-4e99-ad92-7dc85982a1ee service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Received event network-vif-plugged-ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:16.481676 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-8f4817de-7e6f-4346-a3e5-c0471c845887 req-7dd08f06-1308-4e99-ad92-7dc85982a1ee service nova] Acquiring lock "f4820198-6af7-4434-b811-c208d50a5743-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:16.482004 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-8f4817de-7e6f-4346-a3e5-c0471c845887 req-7dd08f06-1308-4e99-ad92-7dc85982a1ee service nova] Lock "f4820198-6af7-4434-b811-c208d50a5743-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:16.482315 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-8f4817de-7e6f-4346-a3e5-c0471c845887 req-7dd08f06-1308-4e99-ad92-7dc85982a1ee service nova] Lock "f4820198-6af7-4434-b811-c208d50a5743-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:16.483078 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-8f4817de-7e6f-4346-a3e5-c0471c845887 req-7dd08f06-1308-4e99-ad92-7dc85982a1ee service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] No waiting events found dispatching network-vif-plugged-ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) 
pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:09:16.483367 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-8f4817de-7e6f-4346-a3e5-c0471c845887 req-7dd08f06-1308-4e99-ad92-7dc85982a1ee service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Received unexpected event network-vif-plugged-ef896d3c-f7c0-4559-859b-b05102fdb388 for instance with vm_state active and task_state deleting. Aug 30 14:09:16.483832 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:09:16.484166 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:09:16.484383 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] VM Paused (Lifecycle Event) Aug 30 14:09:16.508629 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:09:16.514815 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:09:16.515069 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] VM Resumed (Lifecycle Event) Aug 30 14:09:16.667381 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None 
req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:09:16.669996 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Took 9.21 seconds to spawn the instance on the hypervisor. Aug 30 14:09:16.670619 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:09:16.676253 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:09:16.693811 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:16.694673 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 
tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:16.710378 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:09:16.716359 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:09:16.717322 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Claim successful on node np0035104604 Aug 30 14:09:16.779652 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Took 11.62 seconds to build instance. 
Aug 30 14:09:16.802206 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3258a42d-a494-4b79-be2e-1b5c4fe1d634 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.712s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:17.006824 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: f4820198-6af7-4434-b811-c208d50a5743] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:09:17.235058 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:17.237440 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: f4820198-6af7-4434-b811-c208d50a5743] Took 2.27 seconds to deallocate network for instance. 
Aug 30 14:09:17.389917 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:09:17.391350 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:17.705620 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:18.106089 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-fa9af813-e3ff-4e4c-9100-d3bc75d1c8e6 req-30afb4e5-7a4a-499d-afe1-0e3163514929 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received event network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:18.106089 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-fa9af813-e3ff-4e4c-9100-d3bc75d1c8e6 req-30afb4e5-7a4a-499d-afe1-0e3163514929 service nova] Acquiring lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:18.106089 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-fa9af813-e3ff-4e4c-9100-d3bc75d1c8e6 req-30afb4e5-7a4a-499d-afe1-0e3163514929 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:18.106089 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-fa9af813-e3ff-4e4c-9100-d3bc75d1c8e6 req-30afb4e5-7a4a-499d-afe1-0e3163514929 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:18.106089 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-fa9af813-e3ff-4e4c-9100-d3bc75d1c8e6 req-30afb4e5-7a4a-499d-afe1-0e3163514929 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] No waiting events found dispatching network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:09:18.106691 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-fa9af813-e3ff-4e4c-9100-d3bc75d1c8e6 req-30afb4e5-7a4a-499d-afe1-0e3163514929 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received unexpected event network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 for instance with vm_state active and task_state suspending. 
Aug 30 14:09:18.107295 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-fa9af813-e3ff-4e4c-9100-d3bc75d1c8e6 req-30afb4e5-7a4a-499d-afe1-0e3163514929 service nova] [instance: f4820198-6af7-4434-b811-c208d50a5743] Received event network-vif-deleted-ef896d3c-f7c0-4559-859b-b05102fdb388 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:18.687721 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:19.144422 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:23.689810 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:24.147048 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:28.691652 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:29.055274 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:09:29.055274 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: f4820198-6af7-4434-b811-c208d50a5743] VM Stopped (Lifecycle Event) Aug 30 14:09:29.078052 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-7a0ac253-f4c0-4e65-9c50-f34ec9cebc1c None None] [instance: f4820198-6af7-4434-b811-c208d50a5743] 
Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:09:29.148711 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:33.693362 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:34.151012 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:39.154000 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:40.176478 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-cd11b870-d57b-49b7-87b4-d17edbdd4e67 tempest-ServersAdminNegativeTestJSON-1141073960 tempest-ServersAdminNegativeTestJSON-1141073960-project-admin] Lazy-loading 'pci_devices' on Instance uuid b23c4b18-61e2-4fd1-b593-e7d9f7488f70 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:09:40.229395 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:40.267240 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:09:40.267651 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] VM Paused 
(Lifecycle Event) Aug 30 14:09:40.301885 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:09:40.339109 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:09:40.378868 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] During sync_power_state the instance has a pending task (suspending). Skip. 
Aug 30 14:09:40.781534 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 23.075s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:40.788796 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:09:40.810774 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:09:40.849216 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 24.155s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:40.849943 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:09:40.871169 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 23.478s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:40.871169 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:41.008111 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:41.016391 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:09:41.016667 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:09:41.166474 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:41.178451 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Aug 30 14:09:41.182601 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-cd11b870-d57b-49b7-87b4-d17edbdd4e67 tempest-ServersAdminNegativeTestJSON-1141073960 tempest-ServersAdminNegativeTestJSON-1141073960-project-admin] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:09:41.299570 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8043c2265984da88bd2252d8b2cf983', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b35dffa3f90f40559bd146f98ed3083d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:09:41.301788 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:41.309509 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-6dbda8ef-85cb-46bd-bfe5-7b2e97635348 req-9538fef6-237d-4fea-90f6-6025e7394690 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received event network-vif-unplugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:41.310017 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-6dbda8ef-85cb-46bd-bfe5-7b2e97635348 
req-9538fef6-237d-4fea-90f6-6025e7394690 service nova] Acquiring lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:41.310482 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-6dbda8ef-85cb-46bd-bfe5-7b2e97635348 req-9538fef6-237d-4fea-90f6-6025e7394690 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:41.310811 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-6dbda8ef-85cb-46bd-bfe5-7b2e97635348 req-9538fef6-237d-4fea-90f6-6025e7394690 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:41.311236 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-6dbda8ef-85cb-46bd-bfe5-7b2e97635348 req-9538fef6-237d-4fea-90f6-6025e7394690 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] No waiting events found dispatching network-vif-unplugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:09:41.311650 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-6dbda8ef-85cb-46bd-bfe5-7b2e97635348 req-9538fef6-237d-4fea-90f6-6025e7394690 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received unexpected event network-vif-unplugged-8615a5c7-fce0-4e13-bf14-4f123096a571 for instance with vm_state active and task_state suspending. 
Aug 30 14:09:41.443430 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Start building block device mappings for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:09:41.449043 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:41.777169 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Start spawning the instance on the hypervisor. {{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:09:41.778056 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:09:41.778471 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Creating image(s) Aug 30 14:09:41.819926 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 
tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image cd08791f-846f-4204-97a6-eb1701ed0723_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:41.848467 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image cd08791f-846f-4204-97a6-eb1701ed0723_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:41.887288 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image cd08791f-846f-4204-97a6-eb1701ed0723_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:41.891295 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:42.045740 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.154s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:42.046428 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:42.047401 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:42.047780 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:42.075246 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image 
cd08791f-846f-4204-97a6-eb1701ed0723_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:42.079976 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e cd08791f-846f-4204-97a6-eb1701ed0723_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:42.306920 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:42.401191 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e cd08791f-846f-4204-97a6-eb1701ed0723_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.320s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:42.452978 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 
tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Successfully created port: fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:09:42.649068 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] resizing rbd image cd08791f-846f-4204-97a6-eb1701ed0723_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:09:42.945633 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image cd08791f-846f-4204-97a6-eb1701ed0723_disk.eph0 does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:42.997958 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image cd08791f-846f-4204-97a6-eb1701ed0723_disk.eph0 does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:43.003402 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "ephemeral_1_40d1d2c" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:43.004818 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils 
[None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "ephemeral_1_40d1d2c" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.002s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:43.005467 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /opt/stack/data/nova/instances/_base/ephemeral_1_40d1d2c 1G {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:43.050740 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /opt/stack/data/nova/instances/_base/ephemeral_1_40d1d2c 1G" returned: 0 in 0.045s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:43.051393 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): mkfs -t ext4 -F -L ephemeral0 /opt/stack/data/nova/instances/_base/ephemeral_1_40d1d2c {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:43.092700 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "mkfs -t ext4 -F -L ephemeral0 /opt/stack/data/nova/instances/_base/ephemeral_1_40d1d2c" returned: 0 in 0.041s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:43.093873 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "ephemeral_1_40d1d2c" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.089s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:43.136636 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image cd08791f-846f-4204-97a6-eb1701ed0723_disk.eph0 does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:43.140140 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/ephemeral_1_40d1d2c cd08791f-846f-4204-97a6-eb1701ed0723_disk.eph0 --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:43.304022 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 
tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.996s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:43.312286 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:09:43.338199 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:09:43.411750 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.542s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:43.696923 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Deleted allocations for instance f4820198-6af7-4434-b811-c208d50a5743 Aug 30 14:09:43.748373 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received event network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:43.749910 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] Acquiring lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:43.750421 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:43.750708 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: 
held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:43.752207 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] No waiting events found dispatching network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:09:43.752506 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received unexpected event network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 for instance with vm_state suspended and task_state None. Aug 30 14:09:43.755151 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received event network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:43.755813 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] Acquiring lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:43.756404 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:43.756882 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:43.759419 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] No waiting events found dispatching network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:09:43.760121 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received unexpected event network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 for instance with vm_state suspended and task_state None. 
Aug 30 14:09:43.762635 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received event network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:43.763528 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] Acquiring lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:43.764828 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:43.766330 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.002s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:43.766848 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] No waiting events found dispatching 
network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:09:43.767288 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-85699390-5b36-4e72-a502-7484a4ea3195 req-393ee40f-d35c-41b0-9e24-1a3e68c72446 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received unexpected event network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 for instance with vm_state suspended and task_state None. Aug 30 14:09:43.894019 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-66ca9484-181b-49b3-9db0-6d5a923b8926 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "f4820198-6af7-4434-b811-c208d50a5743" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 30.143s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:44.157788 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:44.176409 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Successfully updated port: fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:09:44.190344 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "refresh_cache-cd08791f-846f-4204-97a6-eb1701ed0723" 
{{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:09:44.190344 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquired lock "refresh_cache-cd08791f-846f-4204-97a6-eb1701ed0723" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:09:44.190344 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:09:44.382182 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-58f10679-f4c2-4c0a-8ad4-a4bfcec9a6ba req-5cc856f5-d715-4531-96b4-fe21cf1de8ab service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Received event network-changed-fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:44.382705 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-58f10679-f4c2-4c0a-8ad4-a4bfcec9a6ba req-5cc856f5-d715-4531-96b4-fe21cf1de8ab service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Refreshing instance network info cache due to event network-changed-fa0425cc-6d23-4e9d-90d9-c26582dd918a. 
{{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:09:44.382839 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-58f10679-f4c2-4c0a-8ad4-a4bfcec9a6ba req-5cc856f5-d715-4531-96b4-fe21cf1de8ab service nova] Acquiring lock "refresh_cache-cd08791f-846f-4204-97a6-eb1701ed0723" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:09:44.476682 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:09:44.724123 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:44.938864 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/ephemeral_1_40d1d2c cd08791f-846f-4204-97a6-eb1701ed0723_disk.eph0 --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 1.799s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:45.256523 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Created local disks {{(pid=107505) 
_create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:09:45.257498 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Ensure instance console log exists: /opt/stack/data/nova/instances/cd08791f-846f-4204-97a6-eb1701ed0723/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:09:45.259083 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:45.260359 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:45.261313 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:45.669939 
np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:45.670676 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:45.671654 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:45.672116 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} 
Aug 30 14:09:45.672570 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:45.676393 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Terminating instance Aug 30 14:09:45.680323 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Start destroying the instance on the hypervisor. {{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:09:45.690203 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Instance destroyed successfully. 
Aug 30 14:09:45.690965 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lazy-loading 'resources' on Instance uuid b23c4b18-61e2-4fd1-b593-e7d9f7488f70 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:09:45.708938 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:09:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersAdminNegativeTestJSON-server-402883287',display_name='tempest-ServersAdminNegativeTestJSON-server-402883287',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadminnegativetestjson-server-402883287',id=12,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:09:16Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='9c6fb09777a641f6a3fae7b21c67ed1a',ramdisk_id='',reservation_id='r-60ao0st3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,
member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',old_vm_state='active',owner_project_name='tempest-ServersAdminNegativeTestJSON-1539174490',owner_user_name='tempest-ServersAdminNegativeTestJSON-1539174490-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-08-30T14:09:42Z,user_data=None,user_id='8202a5c338754bafb4d49f80004f2a7d',uuid=b23c4b18-61e2-4fd1-b593-e7d9f7488f70,vcpu_model=,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "8615a5c7-fce0-4e13-bf14-4f123096a571", "address": "fa:16:3e:d3:c6:20", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8615a5c7-fc", "ovs_interfaceid": "8615a5c7-fce0-4e13-bf14-4f123096a571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:09:45.709443 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Converting VIF {"id": "8615a5c7-fce0-4e13-bf14-4f123096a571", "address": "fa:16:3e:d3:c6:20", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8615a5c7-fc", "ovs_interfaceid": "8615a5c7-fce0-4e13-bf14-4f123096a571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:09:45.710916 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c6:20,bridge_name='br-int',has_traffic_filtering=True,id=8615a5c7-fce0-4e13-bf14-4f123096a571,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8615a5c7-fc') {{(pid=107505) 
nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:09:45.711621 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c6:20,bridge_name='br-int',has_traffic_filtering=True,id=8615a5c7-fce0-4e13-bf14-4f123096a571,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8615a5c7-fc') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:09:45.715930 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:45.716166 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8615a5c7-fc, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:09:45.718455 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:45.721322 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:09:45.726219 np0035104604 nova-compute[107505]: INFO os_vif [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c6:20,bridge_name='br-int',has_traffic_filtering=True,id=8615a5c7-fce0-4e13-bf14-4f123096a571,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8615a5c7-fc') Aug 30 14:09:45.751353 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Updating instance_info_cache with network_info: [{"id": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "address": "fa:16:3e:86:83:a5", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0425cc-6d", "ovs_interfaceid": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:09:45.786164 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 
tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Releasing lock "refresh_cache-cd08791f-846f-4204-97a6-eb1701ed0723" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:09:45.786725 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Instance network_info: |[{"id": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "address": "fa:16:3e:86:83:a5", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0425cc-6d", "ovs_interfaceid": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:09:45.788216 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-58f10679-f4c2-4c0a-8ad4-a4bfcec9a6ba req-5cc856f5-d715-4531-96b4-fe21cf1de8ab service nova] Acquired lock "refresh_cache-cd08791f-846f-4204-97a6-eb1701ed0723" {{(pid=107505) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:09:45.788708 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-58f10679-f4c2-4c0a-8ad4-a4bfcec9a6ba req-5cc856f5-d715-4531-96b4-fe21cf1de8ab service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Refreshing network info cache for port fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:09:45.795181 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Start _get_guest_xml network_info=[{"id": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "address": "fa:16:3e:86:83:a5", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0425cc-6d", "ovs_interfaceid": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 
'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [{'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vdb', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 1, 'disk_bus': 'virtio'}], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:09:45.802095 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:09:45.946351 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... 
{{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:09:45.947140 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:09:45.950609 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:09:45.951087 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:09:45.952852 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:09:45.953558 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:07:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={hw_rng:allowed='True'},flavorid='585470641',id=34,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_1-1338430923',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:09:45.954035 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:09:45.954420 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:09:45.954884 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:09:45.955215 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:09:45.955601 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:09:45.956020 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 
14:09:45.956403 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:09:45.957782 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:09:45.957782 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:09:45.957782 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:09:45.979974 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 
30 14:09:46.190255 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Deleting instance files /opt/stack/data/nova/instances/b23c4b18-61e2-4fd1-b593-e7d9f7488f70_del Aug 30 14:09:46.190255 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Deletion of /opt/stack/data/nova/instances/b23c4b18-61e2-4fd1-b593-e7d9f7488f70_del complete Aug 30 14:09:46.387147 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-423fd85c-477d-4ccf-8ee8-f3eea54bf157 req-d48b7c4a-a6e8-45fe-9590-dac351630ce5 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received event network-vif-unplugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:46.387773 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-423fd85c-477d-4ccf-8ee8-f3eea54bf157 req-d48b7c4a-a6e8-45fe-9590-dac351630ce5 service nova] Acquiring lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:46.387864 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-423fd85c-477d-4ccf-8ee8-f3eea54bf157 req-d48b7c4a-a6e8-45fe-9590-dac351630ce5 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:46.388144 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-423fd85c-477d-4ccf-8ee8-f3eea54bf157 req-d48b7c4a-a6e8-45fe-9590-dac351630ce5 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:46.388366 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-423fd85c-477d-4ccf-8ee8-f3eea54bf157 req-d48b7c4a-a6e8-45fe-9590-dac351630ce5 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] No waiting events found dispatching network-vif-unplugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:09:46.388734 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-423fd85c-477d-4ccf-8ee8-f3eea54bf157 req-d48b7c4a-a6e8-45fe-9590-dac351630ce5 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received event network-vif-unplugged-8615a5c7-fce0-4e13-bf14-4f123096a571 for instance with task_state deleting. 
{{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Aug 30 14:09:46.389031 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-423fd85c-477d-4ccf-8ee8-f3eea54bf157 req-d48b7c4a-a6e8-45fe-9590-dac351630ce5 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received event network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:46.389399 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-423fd85c-477d-4ccf-8ee8-f3eea54bf157 req-d48b7c4a-a6e8-45fe-9590-dac351630ce5 service nova] Acquiring lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:46.389764 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-423fd85c-477d-4ccf-8ee8-f3eea54bf157 req-d48b7c4a-a6e8-45fe-9590-dac351630ce5 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:46.390025 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-423fd85c-477d-4ccf-8ee8-f3eea54bf157 req-d48b7c4a-a6e8-45fe-9590-dac351630ce5 service nova] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:46.390307 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-423fd85c-477d-4ccf-8ee8-f3eea54bf157 req-d48b7c4a-a6e8-45fe-9590-dac351630ce5 service nova] [instance: 
b23c4b18-61e2-4fd1-b593-e7d9f7488f70] No waiting events found dispatching network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:09:46.390570 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-423fd85c-477d-4ccf-8ee8-f3eea54bf157 req-d48b7c4a-a6e8-45fe-9590-dac351630ce5 service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received unexpected event network-vif-plugged-8615a5c7-fce0-4e13-bf14-4f123096a571 for instance with vm_state suspended and task_state deleting. Aug 30 14:09:46.427540 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Took 0.75 seconds to destroy the instance on the hypervisor. Aug 30 14:09:46.428081 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:09:46.428563 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:09:46.428751 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:09:46.974713 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.995s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:46.975735 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:47.440657 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:09:47.456693 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ac7253da-6a4c-4af9-8291-d6c76f090485 req-accd2163-05d1-41fd-83dd-620ac16dbb5e service 
nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Received event network-vif-deleted-8615a5c7-fce0-4e13-bf14-4f123096a571 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:47.457034 np0035104604 nova-compute[107505]: INFO nova.compute.manager [req-ac7253da-6a4c-4af9-8291-d6c76f090485 req-accd2163-05d1-41fd-83dd-620ac16dbb5e service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Neutron deleted interface 8615a5c7-fce0-4e13-bf14-4f123096a571; detaching it from the instance and deleting it from the info cache Aug 30 14:09:47.457233 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-ac7253da-6a4c-4af9-8291-d6c76f090485 req-accd2163-05d1-41fd-83dd-620ac16dbb5e service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:09:47.464140 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Took 1.04 seconds to deallocate network for instance. Aug 30 14:09:47.482269 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ac7253da-6a4c-4af9-8291-d6c76f090485 req-accd2163-05d1-41fd-83dd-620ac16dbb5e service nova] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Detach interface failed, port_id=8615a5c7-fce0-4e13-bf14-4f123096a571, reason: Instance b23c4b18-61e2-4fd1-b593-e7d9f7488f70 could not be found. {{(pid=107505) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10866}} Aug 30 14:09:47.492050 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-58f10679-f4c2-4c0a-8ad4-a4bfcec9a6ba req-5cc856f5-d715-4531-96b4-fe21cf1de8ab service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Updated VIF entry in instance network info cache for port fa0425cc-6d23-4e9d-90d9-c26582dd918a. 
{{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:09:47.492818 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-58f10679-f4c2-4c0a-8ad4-a4bfcec9a6ba req-5cc856f5-d715-4531-96b4-fe21cf1de8ab service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Updating instance_info_cache with network_info: [{"id": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "address": "fa:16:3e:86:83:a5", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0425cc-6d", "ovs_interfaceid": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:09:47.511767 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-58f10679-f4c2-4c0a-8ad4-a4bfcec9a6ba req-5cc856f5-d715-4531-96b4-fe21cf1de8ab service nova] Releasing lock "refresh_cache-cd08791f-846f-4204-97a6-eb1701ed0723" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:09:47.539176 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 
tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:47.539700 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:47.824169 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.848s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:47.865010 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image cd08791f-846f-4204-97a6-eb1701ed0723_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:47.870259 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf 
{{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:48.237681 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:48.697307 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:48.701221 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.831s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:48.703891 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:09:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-851385372',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-851385372',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(34),hidden=False,host='np0035104604',hostname='tempest-serverswithspecificflavortestjson-server-851385372',id=13,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=34,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHHdXbSjxXwqDwe7VjknCsPB7FlaX9kbdidTvee1eM5xK4auSl9TOs3ulw+MzaECXIpkK+ZLwKiI+UH3J46BXA6E7xNkh4IyXHMT0+x8g5UZ8EaR+YM4WD80W1OwwUhUEg==',key_name='tempest-keypair-609321772',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b35dffa3f90f40559bd146f98ed3083d',ramdisk_id='',reservation_id='r-aqmfaonr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-d
isk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-5414259',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:09:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d8043c2265984da88bd2252d8b2cf983',uuid=cd08791f-846f-4204-97a6-eb1701ed0723,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "address": "fa:16:3e:86:83:a5", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0425cc-6d", "ovs_interfaceid": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:09:48.704433 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Converting VIF {"id": 
"fa0425cc-6d23-4e9d-90d9-c26582dd918a", "address": "fa:16:3e:86:83:a5", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0425cc-6d", "ovs_interfaceid": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:09:48.715218 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:83:a5,bridge_name='br-int',has_traffic_filtering=True,id=fa0425cc-6d23-4e9d-90d9-c26582dd918a,network=Network(33008139-2fd9-496e-ac17-91db64b6f4c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0425cc-6d') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:09:48.715218 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lazy-loading 
'pci_devices' on Instance uuid cd08791f-846f-4204-97a6-eb1701ed0723 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:09:48.732451 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] End _get_guest_xml xml= [libvirt domain XML elided: markup was stripped during log capture, leaving only bare text nodes; recoverable values: name instance-0000000d, uuid cd08791f-846f-4204-97a6-eb1701ed0723, memory 131072 KiB (128 MB), 1 vCPU, title tempest-ServersWithSpecificFlavorTestJSON-server-851385372, creation time 2023-08-30 14:09:45, owner tempest-ServersWithSpecificFlavorTestJSON-5414259 / tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member, sysinfo manufacturer OpenStack Foundation, product OpenStack Nova, version 27.1.0, os type hvm, CPU model Nehalem, rng backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:09:48.740257 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Preparing to wait for external event network-vif-plugged-fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:09:48.740257 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event"
{{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:48.740257 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:48.740257 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:48.740257 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:09:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-851385372',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-851385372',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(34),hidden=False,host='np0035104604',hostname='tempest-serverswithspecificflavortestjson-server-851385372',id=13,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=34,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHHdXbSjxXwqDwe7VjknCsPB7FlaX9kbdidTvee1eM5xK4auSl9TOs3ulw+MzaECXIpkK+ZLwKiI+UH3J46BXA6E7xNkh4IyXHMT0+x8g5UZ8EaR+YM4WD80W1OwwUhUEg==',key_name='tempest-keypair-609321772',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b35dffa3f90f40559bd146f98ed3083d',ramdisk_id='',reservation_id='r-aqmfaonr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0
.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-5414259',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:09:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d8043c2265984da88bd2252d8b2cf983',uuid=cd08791f-846f-4204-97a6-eb1701ed0723,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "address": "fa:16:3e:86:83:a5", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0425cc-6d", "ovs_interfaceid": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:09:48.740738 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Converting VIF {"id": 
"fa0425cc-6d23-4e9d-90d9-c26582dd918a", "address": "fa:16:3e:86:83:a5", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0425cc-6d", "ovs_interfaceid": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:09:48.740738 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:83:a5,bridge_name='br-int',has_traffic_filtering=True,id=fa0425cc-6d23-4e9d-90d9-c26582dd918a,network=Network(33008139-2fd9-496e-ac17-91db64b6f4c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0425cc-6d') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:09:48.740738 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:86:83:a5,bridge_name='br-int',has_traffic_filtering=True,id=fa0425cc-6d23-4e9d-90d9-c26582dd918a,network=Network(33008139-2fd9-496e-ac17-91db64b6f4c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0425cc-6d') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:09:48.740738 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:48.740738 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:09:48.740738 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:09:48.742793 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:48.743082 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa0425cc-6d, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:09:48.743637 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfa0425cc-6d, 
col_values=(('external_ids', {'iface-id': 'fa0425cc-6d23-4e9d-90d9-c26582dd918a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:83:a5', 'vm-uuid': 'cd08791f-846f-4204-97a6-eb1701ed0723'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:09:48.746204 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:48.748048 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:09:48.752811 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:48.756812 np0035104604 nova-compute[107505]: INFO os_vif [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:83:a5,bridge_name='br-int',has_traffic_filtering=True,id=fa0425cc-6d23-4e9d-90d9-c26582dd918a,network=Network(33008139-2fd9-496e-ac17-91db64b6f4c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0425cc-6d') Aug 30 14:09:48.805172 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:09:48.805458 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] No BDM found with device name vdb, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:09:48.805813 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] No BDM found with device name hda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:09:48.806050 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] No VIF found with MAC fa:16:3e:86:83:a5, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:09:48.806989 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Using config drive Aug 30 14:09:48.839648 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image cd08791f-846f-4204-97a6-eb1701ed0723_disk.config does not exist {{(pid=107505) __init__ 
/opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:49.010176 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.773s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:49.015454 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:09:49.033812 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:09:49.066335 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.526s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:49.363456 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Deleted allocations for instance b23c4b18-61e2-4fd1-b593-e7d9f7488f70 Aug 30 14:09:49.408820 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Creating config drive at /opt/stack/data/nova/instances/cd08791f-846f-4204-97a6-eb1701ed0723/disk.config Aug 30 14:09:49.418862 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/cd08791f-846f-4204-97a6-eb1701ed0723/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpvc1j8y27 {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:49.477865 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:49.480953 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 
tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/cd08791f-846f-4204-97a6-eb1701ed0723/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpvc1j8y27" returned: 0 in 0.062s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:49.519619 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] rbd image cd08791f-846f-4204-97a6-eb1701ed0723_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:09:49.523567 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/cd08791f-846f-4204-97a6-eb1701ed0723/disk.config cd08791f-846f-4204-97a6-eb1701ed0723_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:49.545895 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-63bf4e20-de19-49e1-880d-aae8c66ea3a8 tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "b23c4b18-61e2-4fd1-b593-e7d9f7488f70" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.874s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:49.699217 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.processutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/cd08791f-846f-4204-97a6-eb1701ed0723/disk.config cd08791f-846f-4204-97a6-eb1701ed0723_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.176s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:49.699730 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Deleting local config drive /opt/stack/data/nova/instances/cd08791f-846f-4204-97a6-eb1701ed0723/disk.config because it was imported into RBD. Aug 30 14:09:49.719154 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:49.733675 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:49.738719 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:50.080499 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:50.081447 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:50.082164 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:50.083006 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:50.084027 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock 
"31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:50.094510 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Terminating instance Aug 30 14:09:50.100880 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Start destroying the instance on the hypervisor. {{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:09:50.151150 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:50.169169 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:50.245787 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:50.272393 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:50.275985 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:50.292077 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:50.305412 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:50.502970 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-1514e071-a4f2-4a37-aee4-7463c2ccd628 req-b1878c86-d72b-411d-af1d-305caba687e0 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Received event network-vif-plugged-fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:50.503452 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1514e071-a4f2-4a37-aee4-7463c2ccd628 req-b1878c86-d72b-411d-af1d-305caba687e0 service nova] Acquiring lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:50.503882 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1514e071-a4f2-4a37-aee4-7463c2ccd628 req-b1878c86-d72b-411d-af1d-305caba687e0 service nova] Lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:50.504296 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-1514e071-a4f2-4a37-aee4-7463c2ccd628 req-b1878c86-d72b-411d-af1d-305caba687e0 service nova] Lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" 
"released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:50.504702 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-1514e071-a4f2-4a37-aee4-7463c2ccd628 req-b1878c86-d72b-411d-af1d-305caba687e0 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Processing event network-vif-plugged-fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:09:50.558318 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Instance destroyed successfully. Aug 30 14:09:50.559756 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lazy-loading 'resources' on Instance uuid 31c8d191-4fe4-444a-9e36-e7506e8ebc76 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:09:50.576443 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:08:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersAdminNegativeTestJSON-server-1311328016',display_name='tempest-ServersAdminNegativeTestJSON-server-1311328016',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serversadminnegativetestjson-server-1311328016',id=11,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:09:01Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='9c6fb09777a641f6a3fae7b21c67ed1a',ramdisk_id='',reservation_id='r-st0iej1s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersAdminNegativeTest
JSON-1539174490',owner_user_name='tempest-ServersAdminNegativeTestJSON-1539174490-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-08-30T14:09:02Z,user_data=None,user_id='8202a5c338754bafb4d49f80004f2a7d',uuid=31c8d191-4fe4-444a-9e36-e7506e8ebc76,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09e56312-7954-44ef-865e-7defd97f48a5", "address": "fa:16:3e:c7:c9:51", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.76", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09e56312-79", "ovs_interfaceid": "09e56312-7954-44ef-865e-7defd97f48a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:09:50.576443 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Converting VIF {"id": "09e56312-7954-44ef-865e-7defd97f48a5", "address": "fa:16:3e:c7:c9:51", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"192.168.233.76", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09e56312-79", "ovs_interfaceid": "09e56312-7954-44ef-865e-7defd97f48a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:09:50.578110 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:c9:51,bridge_name='br-int',has_traffic_filtering=True,id=09e56312-7954-44ef-865e-7defd97f48a5,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09e56312-79') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:09:50.578989 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:c9:51,bridge_name='br-int',has_traffic_filtering=True,id=09e56312-7954-44ef-865e-7defd97f48a5,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09e56312-79') {{(pid=107505) unplug 
/opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:09:50.584137 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:50.584655 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09e56312-79, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:09:50.587389 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:50.592677 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:09:50.600540 np0035104604 nova-compute[107505]: INFO os_vif [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:c9:51,bridge_name='br-int',has_traffic_filtering=True,id=09e56312-7954-44ef-865e-7defd97f48a5,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09e56312-79') Aug 30 14:09:50.944730 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:09:50.945430 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 
cd08791f-846f-4204-97a6-eb1701ed0723] VM Started (Lifecycle Event) Aug 30 14:09:50.949266 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:09:50.965298 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:09:50.970101 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Instance spawned successfully. 
Aug 30 14:09:50.972612 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:09:50.976144 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:09:50.982874 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:09:51.004182 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:51.004437 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] 
[instance: cd08791f-846f-4204-97a6-eb1701ed0723] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:51.005023 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:51.005509 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:51.006268 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:51.006817 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:09:51.023923 
np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:09:51.024428 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:09:51.024777 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] VM Paused (Lifecycle Event) Aug 30 14:09:51.072821 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:09:51.083685 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Deleting instance files /opt/stack/data/nova/instances/31c8d191-4fe4-444a-9e36-e7506e8ebc76_del Aug 30 14:09:51.084444 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Deletion of /opt/stack/data/nova/instances/31c8d191-4fe4-444a-9e36-e7506e8ebc76_del complete Aug 30 14:09:51.090511 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 
14:09:51.090867 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] VM Resumed (Lifecycle Event) Aug 30 14:09:51.112410 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:09:51.117887 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:09:51.157547 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Took 1.06 seconds to destroy the instance on the hypervisor. Aug 30 14:09:51.158199 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:09:51.158560 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:09:51.158787 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:09:51.318485 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Took 9.54 seconds to spawn the instance on the hypervisor. Aug 30 14:09:51.323162 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:09:51.323162 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:09:51.438974 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Took 34.92 seconds to build instance. 
Aug 30 14:09:51.464250 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-08584c3b-19ab-41b1-a719-1d7425aacdf0 tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "cd08791f-846f-4204-97a6-eb1701ed0723" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 35.078s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:52.053974 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:09:52.078426 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Took 0.92 seconds to deallocate network for instance. 
Aug 30 14:09:52.161669 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:52.162045 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:52.613135 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Received event network-vif-plugged-fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:52.613235 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] Acquiring lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:52.613550 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] Lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:52.613788 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] Lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:52.614053 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] No waiting events found dispatching network-vif-plugged-fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:09:52.614338 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Received unexpected event network-vif-plugged-fa0425cc-6d23-4e9d-90d9-c26582dd918a for instance with vm_state active and task_state None. 
Aug 30 14:09:52.614591 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Received event network-vif-unplugged-09e56312-7954-44ef-865e-7defd97f48a5 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:52.614930 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] Acquiring lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:52.615383 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] Lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:52.615633 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] Lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:52.615868 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] No waiting events found dispatching 
network-vif-unplugged-09e56312-7954-44ef-865e-7defd97f48a5 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:09:52.616124 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Received unexpected event network-vif-unplugged-09e56312-7954-44ef-865e-7defd97f48a5 for instance with vm_state deleted and task_state None. Aug 30 14:09:52.616508 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Received event network-vif-plugged-09e56312-7954-44ef-865e-7defd97f48a5 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:52.616749 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] Acquiring lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:09:52.617019 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] Lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:09:52.617588 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] Lock 
"31c8d191-4fe4-444a-9e36-e7506e8ebc76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:52.617912 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] No waiting events found dispatching network-vif-plugged-09e56312-7954-44ef-865e-7defd97f48a5 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:09:52.618256 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Received unexpected event network-vif-plugged-09e56312-7954-44ef-865e-7defd97f48a5 for instance with vm_state deleted and task_state None. 
Aug 30 14:09:52.618518 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ff9410d8-5545-4657-a070-8edbb918394e req-87ab8080-0b8b-466f-ba50-64bfba1a2536 service nova] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Received event network-vif-deleted-09e56312-7954-44ef-865e-7defd97f48a5 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:52.644721 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:09:53.509911 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.865s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:09:53.516756 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:09:53.532499 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Inventory has not 
changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:09:53.565561 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.403s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:53.699127 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:53.735927 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Deleted allocations for instance 31c8d191-4fe4-444a-9e36-e7506e8ebc76 Aug 30 14:09:53.821254 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-30030f5d-06ea-4313-9b48-e5f6e9044c2f tempest-ServersAdminNegativeTestJSON-1539174490 tempest-ServersAdminNegativeTestJSON-1539174490-project-member] Lock "31c8d191-4fe4-444a-9e36-e7506e8ebc76" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.740s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:09:54.105834 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-0087d185-38d7-403f-b743-b29f9e83aedf req-86d53923-92da-493c-b14b-4877fed904e4 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Received event network-changed-fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:09:54.107299 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-0087d185-38d7-403f-b743-b29f9e83aedf req-86d53923-92da-493c-b14b-4877fed904e4 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Refreshing instance network info cache due to event network-changed-fa0425cc-6d23-4e9d-90d9-c26582dd918a. {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:09:54.107299 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0087d185-38d7-403f-b743-b29f9e83aedf req-86d53923-92da-493c-b14b-4877fed904e4 service nova] Acquiring lock "refresh_cache-cd08791f-846f-4204-97a6-eb1701ed0723" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:09:54.107299 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0087d185-38d7-403f-b743-b29f9e83aedf req-86d53923-92da-493c-b14b-4877fed904e4 service nova] Acquired lock "refresh_cache-cd08791f-846f-4204-97a6-eb1701ed0723" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:09:54.107990 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-0087d185-38d7-403f-b743-b29f9e83aedf req-86d53923-92da-493c-b14b-4877fed904e4 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Refreshing network info cache for port fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:09:55.423245 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:55.624091 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:55.724379 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-0087d185-38d7-403f-b743-b29f9e83aedf req-86d53923-92da-493c-b14b-4877fed904e4 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Updated VIF entry in instance network info cache for port fa0425cc-6d23-4e9d-90d9-c26582dd918a. {{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:09:55.724928 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-0087d185-38d7-403f-b743-b29f9e83aedf req-86d53923-92da-493c-b14b-4877fed904e4 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Updating instance_info_cache with network_info: [{"id": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "address": "fa:16:3e:86:83:a5", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.10", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": 
"ovn"}}, "devname": "tapfa0425cc-6d", "ovs_interfaceid": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:09:55.804569 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0087d185-38d7-403f-b743-b29f9e83aedf req-86d53923-92da-493c-b14b-4877fed904e4 service nova] Releasing lock "refresh_cache-cd08791f-846f-4204-97a6-eb1701ed0723" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:09:56.158301 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:09:56.158551 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] VM Stopped (Lifecycle Event) Aug 30 14:09:56.191332 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-6fa3511e-46a2-46ad-a2d3-63edfe23808a None None] [instance: b23c4b18-61e2-4fd1-b593-e7d9f7488f70] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:09:57.415020 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:09:58.699396 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:00.035336 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 
tempest-AttachInterfacesV270Test-2030737557-project-member] Acquiring lock "181b742c-8cfa-491a-abac-26a7ade35dcf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:00.036276 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:00.057534 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Starting instance... 
{{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:10:00.266890 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:00.267388 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:00.273264 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:10:00.273612 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Claim successful on node np0035104604 Aug 30 14:10:00.683432 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:00.847069 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:01.498787 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.651s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:01.505000 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:10:01.521335 np0035104604 nova-compute[107505]: DEBUG 
nova.scheduler.client.report [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:10:01.566415 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.299s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:01.567500 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:10:01.633761 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:10:01.634413 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:10:01.756246 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:10:01.776699 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Start building block device mappings for instance. 
{{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:10:01.867105 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3903524595c2417b9e0f6c77c97b986c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f6283d5b3a59498c8e862844ef48e9f0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:10:01.984139 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:10:01.985265 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:10:01.985768 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Creating image(s) Aug 30 14:10:02.020064 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] rbd image 181b742c-8cfa-491a-abac-26a7ade35dcf_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:02.051608 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] rbd image 181b742c-8cfa-491a-abac-26a7ade35dcf_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:02.092234 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] rbd image 181b742c-8cfa-491a-abac-26a7ade35dcf_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:02.095575 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.processutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:02.189981 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.093s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:02.191464 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:02.192776 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by 
"nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.002s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:02.193775 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:02.242304 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] rbd image 181b742c-8cfa-491a-abac-26a7ade35dcf_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:02.247209 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 181b742c-8cfa-491a-abac-26a7ade35dcf_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:02.560767 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 
181b742c-8cfa-491a-abac-26a7ade35dcf_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.313s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:02.644912 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] resizing rbd image 181b742c-8cfa-491a-abac-26a7ade35dcf_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:10:02.787140 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:10:02.788223 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Ensure instance console log exists: /opt/stack/data/nova/instances/181b742c-8cfa-491a-abac-26a7ade35dcf/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:10:02.789037 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:02.789605 np0035104604 
nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:02.790089 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:02.826938 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Successfully created port: 4c8c467b-1126-49ec-9cd3-648ecd61b8c7 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:10:03.599074 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Successfully updated port: 4c8c467b-1126-49ec-9cd3-648ecd61b8c7 {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:10:03.615351 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Acquiring lock 
"refresh_cache-181b742c-8cfa-491a-abac-26a7ade35dcf" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:10:03.616072 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Acquired lock "refresh_cache-181b742c-8cfa-491a-abac-26a7ade35dcf" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:10:03.616489 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:10:03.746855 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:03.750615 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-db08cbf2-8cb2-465d-89d3-194e7454abc1 req-09f29d40-140b-4f6d-8e4f-b7ffaee72d5b service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Received event network-changed-4c8c467b-1126-49ec-9cd3-648ecd61b8c7 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:03.751089 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-db08cbf2-8cb2-465d-89d3-194e7454abc1 req-09f29d40-140b-4f6d-8e4f-b7ffaee72d5b service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Refreshing instance network info cache due to event network-changed-4c8c467b-1126-49ec-9cd3-648ecd61b8c7. 
{{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:10:03.751516 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-db08cbf2-8cb2-465d-89d3-194e7454abc1 req-09f29d40-140b-4f6d-8e4f-b7ffaee72d5b service nova] Acquiring lock "refresh_cache-181b742c-8cfa-491a-abac-26a7ade35dcf" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:10:03.820752 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:10:04.799730 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Updating instance_info_cache with network_info: [{"id": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "address": "fa:16:3e:a8:a9:b4", "network": {"id": "266b543e-1ede-4388-a169-60bf5491ca59", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2139002279-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "f6283d5b3a59498c8e862844ef48e9f0", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c8c467b-11", "ovs_interfaceid": 
"4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:10:04.818809 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Releasing lock "refresh_cache-181b742c-8cfa-491a-abac-26a7ade35dcf" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:10:04.819556 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Instance network_info: |[{"id": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "address": "fa:16:3e:a8:a9:b4", "network": {"id": "266b543e-1ede-4388-a169-60bf5491ca59", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2139002279-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "f6283d5b3a59498c8e862844ef48e9f0", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c8c467b-11", "ovs_interfaceid": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:10:04.820190 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-db08cbf2-8cb2-465d-89d3-194e7454abc1 req-09f29d40-140b-4f6d-8e4f-b7ffaee72d5b service nova] Acquired lock "refresh_cache-181b742c-8cfa-491a-abac-26a7ade35dcf" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:10:04.820617 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-db08cbf2-8cb2-465d-89d3-194e7454abc1 req-09f29d40-140b-4f6d-8e4f-b7ffaee72d5b service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Refreshing network info cache for port 4c8c467b-1126-49ec-9cd3-648ecd61b8c7 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:10:04.826786 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Start _get_guest_xml network_info=[{"id": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "address": "fa:16:3e:a8:a9:b4", "network": {"id": "266b543e-1ede-4388-a169-60bf5491ca59", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2139002279-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "f6283d5b3a59498c8e862844ef48e9f0", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c8c467b-11", "ovs_interfaceid": 
"4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:10:04.831896 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Aug 30 14:10:04.978311 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:10:04.979038 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:10:04.985469 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:10:04.985940 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:10:04.987422 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:10:04.988109 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:10:04.988477 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:10:04.988789 np0035104604 
nova-compute[107505]: DEBUG nova.virt.hardware [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:10:04.989138 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:10:04.989452 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:10:04.989776 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:10:04.990163 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:10:04.990509 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb 
tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:10:04.990841 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:10:04.991204 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:10:04.991556 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:10:05.006052 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:05.550241 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 
14:10:05.550397 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] VM Stopped (Lifecycle Event) Aug 30 14:10:05.668690 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-0881d227-1f56-4cc4-9656-a34f64d42f5a None None] [instance: 31c8d191-4fe4-444a-9e36-e7506e8ebc76] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:05.684160 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:05.686051 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.680s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:05.715791 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] rbd image 181b742c-8cfa-491a-abac-26a7ade35dcf_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:05.732849 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:06.226223 np0035104604 
nova-compute[107505]: DEBUG nova.network.neutron [req-db08cbf2-8cb2-465d-89d3-194e7454abc1 req-09f29d40-140b-4f6d-8e4f-b7ffaee72d5b service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Updated VIF entry in instance network info cache for port 4c8c467b-1126-49ec-9cd3-648ecd61b8c7. {{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:10:06.226961 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-db08cbf2-8cb2-465d-89d3-194e7454abc1 req-09f29d40-140b-4f6d-8e4f-b7ffaee72d5b service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Updating instance_info_cache with network_info: [{"id": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "address": "fa:16:3e:a8:a9:b4", "network": {"id": "266b543e-1ede-4388-a169-60bf5491ca59", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2139002279-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "f6283d5b3a59498c8e862844ef48e9f0", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c8c467b-11", "ovs_interfaceid": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:10:06.240545 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-db08cbf2-8cb2-465d-89d3-194e7454abc1 req-09f29d40-140b-4f6d-8e4f-b7ffaee72d5b service nova] Releasing lock 
"refresh_cache-181b742c-8cfa-491a-abac-26a7ade35dcf" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:10:06.321861 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Acquiring lock "10be0cac-53da-4f6f-acba-455135d2b5be" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:06.322494 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock "10be0cac-53da-4f6f-acba-455135d2b5be" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:06.343857 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Starting instance... 
{{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:10:06.561422 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.829s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:06.562800 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:09:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-868222137',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-attachinterfacesv270test-server-868222137',id=14,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f6283d5b3a59498c8e862844ef48e9f0',ramdisk_id='',reservation_id='r-o03shz4h',resources=None,root_device_name='/dev/vda',root_gb=1,security
_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-2030737557',owner_user_name='tempest-AttachInterfacesV270Test-2030737557-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:10:02Z,user_data=None,user_id='3903524595c2417b9e0f6c77c97b986c',uuid=181b742c-8cfa-491a-abac-26a7ade35dcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "address": "fa:16:3e:a8:a9:b4", "network": {"id": "266b543e-1ede-4388-a169-60bf5491ca59", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2139002279-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "f6283d5b3a59498c8e862844ef48e9f0", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c8c467b-11", "ovs_interfaceid": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu 
{{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:10:06.563134 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Converting VIF {"id": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "address": "fa:16:3e:a8:a9:b4", "network": {"id": "266b543e-1ede-4388-a169-60bf5491ca59", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2139002279-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "f6283d5b3a59498c8e862844ef48e9f0", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c8c467b-11", "ovs_interfaceid": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:10:06.564063 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:a9:b4,bridge_name='br-int',has_traffic_filtering=True,id=4c8c467b-1126-49ec-9cd3-648ecd61b8c7,network=Network(266b543e-1ede-4388-a169-60bf5491ca59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c8c467b-11') {{(pid=107505) 
nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:10:06.565055 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lazy-loading 'pci_devices' on Instance uuid 181b742c-8cfa-491a-abac-26a7ade35dcf {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:10:06.581030 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] End _get_guest_xml xml= [multi-line guest domain XML elided: element markup was stripped in capture; recoverable values: uuid 181b742c-8cfa-491a-abac-26a7ade35dcf, name instance-0000000e, memory 131072 KiB, 1 vCPU, nova display name tempest-AttachInterfacesV270Test-server-868222137, creation time 2023-08-30 14:10:04, flavor 128 MB memory / 1 root disk GB / 0 ephemeral / 0 swap / 1 vCPU, owner user tempest-AttachInterfacesV270Test-2030737557-project-member, owner project tempest-AttachInterfacesV270Test-2030737557, sysinfo OpenStack Foundation / OpenStack Nova / version 27.1.0 / serial and uuid 181b742c-8cfa-491a-abac-26a7ade35dcf / family Virtual Machine, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:10:06.588030 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Preparing to wait for external event network-vif-plugged-4c8c467b-1126-49ec-9cd3-648ecd61b8c7 {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:10:06.588030 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Acquiring lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:06.588030 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557
tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:06.588030 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:06.588030 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:09:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-868222137',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-attachinterfacesv270test-server-868222137',id=14,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f6283d5b3a59498c8e862844ef48e9f0',ramdisk_id='',reservation_id='r-o03shz4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-2030737557',owner_user_name='tempest-AttachInterfacesV270Test-2030737557-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=Non
e,updated_at=2023-08-30T14:10:02Z,user_data=None,user_id='3903524595c2417b9e0f6c77c97b986c',uuid=181b742c-8cfa-491a-abac-26a7ade35dcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "address": "fa:16:3e:a8:a9:b4", "network": {"id": "266b543e-1ede-4388-a169-60bf5491ca59", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2139002279-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "f6283d5b3a59498c8e862844ef48e9f0", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c8c467b-11", "ovs_interfaceid": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:10:06.588540 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Converting VIF {"id": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "address": "fa:16:3e:a8:a9:b4", "network": {"id": "266b543e-1ede-4388-a169-60bf5491ca59", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2139002279-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "f6283d5b3a59498c8e862844ef48e9f0", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c8c467b-11", "ovs_interfaceid": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:10:06.588540 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:a9:b4,bridge_name='br-int',has_traffic_filtering=True,id=4c8c467b-1126-49ec-9cd3-648ecd61b8c7,network=Network(266b543e-1ede-4388-a169-60bf5491ca59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c8c467b-11') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:10:06.588540 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:a9:b4,bridge_name='br-int',has_traffic_filtering=True,id=4c8c467b-1126-49ec-9cd3-648ecd61b8c7,network=Network(266b543e-1ede-4388-a169-60bf5491ca59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c8c467b-11') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:10:06.588540 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 
{{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:06.588540 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:10:06.588540 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:10:06.590064 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:06.590529 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:06.594971 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:06.595502 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, 
port=tap4c8c467b-11, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:10:06.596416 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4c8c467b-11, col_values=(('external_ids', {'iface-id': '4c8c467b-1126-49ec-9cd3-648ecd61b8c7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:a9:b4', 'vm-uuid': '181b742c-8cfa-491a-abac-26a7ade35dcf'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:10:06.598585 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:06.607028 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:06.614410 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:10:06.614863 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Claim successful on node np0035104604 Aug 30 14:10:06.620982 np0035104604 nova-compute[107505]: INFO os_vif [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:a9:b4,bridge_name='br-int',has_traffic_filtering=True,id=4c8c467b-1126-49ec-9cd3-648ecd61b8c7,network=Network(266b543e-1ede-4388-a169-60bf5491ca59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c8c467b-11') Aug 30 14:10:06.670998 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:10:06.671252 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:10:06.671603 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] No VIF found with MAC fa:16:3e:a8:a9:b4, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:10:06.672537 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Using config drive Aug 30 14:10:06.703207 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] rbd image 181b742c-8cfa-491a-abac-26a7ade35dcf_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:07.418649 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Creating config drive at /opt/stack/data/nova/instances/181b742c-8cfa-491a-abac-26a7ade35dcf/disk.config Aug 30 14:10:07.424824 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/181b742c-8cfa-491a-abac-26a7ade35dcf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack 
Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpszrcouw_ {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:07.459707 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/181b742c-8cfa-491a-abac-26a7ade35dcf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpszrcouw_" returned: 0 in 0.034s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:07.486386 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] rbd image 181b742c-8cfa-491a-abac-26a7ade35dcf_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:07.489869 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/181b742c-8cfa-491a-abac-26a7ade35dcf/disk.config 181b742c-8cfa-491a-abac-26a7ade35dcf_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:07.649361 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] CMD "rbd import 
--pool vms /opt/stack/data/nova/instances/181b742c-8cfa-491a-abac-26a7ade35dcf/disk.config 181b742c-8cfa-491a-abac-26a7ade35dcf_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.159s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:07.650099 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Deleting local config drive /opt/stack/data/nova/instances/181b742c-8cfa-491a-abac-26a7ade35dcf/disk.config because it was imported into RBD. Aug 30 14:10:07.668155 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:07.687904 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:07.726513 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:07.918805 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-e59ca163-55a4-47d3-a7d4-8d99bb9b3bd9 req-b060fb78-b422-4f6c-935e-8873340982e7 service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Received event network-vif-plugged-4c8c467b-1126-49ec-9cd3-648ecd61b8c7 {{(pid=107505) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:07.918805 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-e59ca163-55a4-47d3-a7d4-8d99bb9b3bd9 req-b060fb78-b422-4f6c-935e-8873340982e7 service nova] Acquiring lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:07.918805 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-e59ca163-55a4-47d3-a7d4-8d99bb9b3bd9 req-b060fb78-b422-4f6c-935e-8873340982e7 service nova] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:07.919281 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-e59ca163-55a4-47d3-a7d4-8d99bb9b3bd9 req-b060fb78-b422-4f6c-935e-8873340982e7 service nova] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:07.919426 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-e59ca163-55a4-47d3-a7d4-8d99bb9b3bd9 req-b060fb78-b422-4f6c-935e-8873340982e7 service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Processing event network-vif-plugged-4c8c467b-1126-49ec-9cd3-648ecd61b8c7 {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:10:08.129225 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:08.148931 
np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:08.164622 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:08.567918 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.880s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:08.575779 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:10:08.596852 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:10:08.632279 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.042s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:08.632877 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:10:08.698837 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:10:08.699184 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:10:08.837370 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:08.845111 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:10:08.845689 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] VM Started (Lifecycle Event) Aug 30 14:10:08.848364 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Aug 30 14:10:08.851680 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:10:08.863592 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:10:08.868066 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Start building block device mappings for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:10:08.874072 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:08.999352 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Instance spawned successfully. 
Aug 30 14:10:09.000175 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:10:09.006402 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:10:09.025081 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:09.026770 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:09.028155 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None 
req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:09.031141 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:09.031141 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:09.031837 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:09.037509 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:10:09.037934 
np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:10:09.066852 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:10:09.067287 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:10:09.067600 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] VM Paused (Lifecycle Event) Aug 30 14:10:09.073007 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '68051cbd26d345158739406b63f39f73', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b4618fbd680046438e24accb6411f011', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:10:09.225739 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Didn't find any instances for network info cache update. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9928}} Aug 30 14:10:09.252623 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Start spawning the instance on the hypervisor. {{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:10:09.252623 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:10:09.252623 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Creating image(s) Aug 30 14:10:09.256270 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] rbd image 10be0cac-53da-4f6f-acba-455135d2b5be_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:09.289944 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] rbd image 10be0cac-53da-4f6f-acba-455135d2b5be_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:09.324688 np0035104604 nova-compute[107505]: DEBUG 
nova.storage.rbd_utils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] rbd image 10be0cac-53da-4f6f-acba-455135d2b5be_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:09.328783 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:09.351287 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:09.355635 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Took 7.37 seconds to spawn the instance on the hypervisor. 
Aug 30 14:10:09.356271 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:09.363569 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:10:09.368995 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:10:09.369371 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] VM Resumed (Lifecycle Event) Aug 30 14:10:09.394668 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:09.409992 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:09.410541 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired 
by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:09.410973 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:09.411397 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:10:09.412061 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:09.437592 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:10:09.443829 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.115s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:09.444610 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:09.445335 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:09.445720 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:09.508471 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] rbd image 
10be0cac-53da-4f6f-acba-455135d2b5be_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:09.512236 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 10be0cac-53da-4f6f-acba-455135d2b5be_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:09.538380 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:10:09.547983 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Took 9.43 seconds to build instance. 
Aug 30 14:10:09.566110 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-dd99c3c2-6a8a-4ca7-ad5f-d0a39c5eafeb tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.530s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:09.879091 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 10be0cac-53da-4f6f-acba-455135d2b5be_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.367s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:09.960686 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] resizing rbd image 10be0cac-53da-4f6f-acba-455135d2b5be_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:10:10.106271 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-eb3e6807-e40f-43cd-b1b8-93f14ed96434 req-b2e7c95f-a200-4c91-a6e2-13af7460e912 service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Received event network-vif-plugged-4c8c467b-1126-49ec-9cd3-648ecd61b8c7 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:10.106271 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-eb3e6807-e40f-43cd-b1b8-93f14ed96434 
req-b2e7c95f-a200-4c91-a6e2-13af7460e912 service nova] Acquiring lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:10.106271 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-eb3e6807-e40f-43cd-b1b8-93f14ed96434 req-b2e7c95f-a200-4c91-a6e2-13af7460e912 service nova] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:10.106781 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-eb3e6807-e40f-43cd-b1b8-93f14ed96434 req-b2e7c95f-a200-4c91-a6e2-13af7460e912 service nova] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:10.107073 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-eb3e6807-e40f-43cd-b1b8-93f14ed96434 req-b2e7c95f-a200-4c91-a6e2-13af7460e912 service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] No waiting events found dispatching network-vif-plugged-4c8c467b-1126-49ec-9cd3-648ecd61b8c7 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:10:10.107421 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-eb3e6807-e40f-43cd-b1b8-93f14ed96434 req-b2e7c95f-a200-4c91-a6e2-13af7460e912 service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Received unexpected event network-vif-plugged-4c8c467b-1126-49ec-9cd3-648ecd61b8c7 for instance with vm_state active and task_state None. 
Aug 30 14:10:10.293147 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:10:10.293775 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Ensure instance console log exists: /opt/stack/data/nova/instances/10be0cac-53da-4f6f-acba-455135d2b5be/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:10:10.294826 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:10.295519 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:10.296325 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock 
"vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:10.300216 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Successfully created port: bcef4154-0fd0-485b-8249-0c3a3fb4e45f {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:10:10.467553 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 1.055s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:10.554823 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-0000000d as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:10:10.555090 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-0000000d as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:10:10.555329 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-0000000d as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:10:10.559915 np0035104604 nova-compute[107505]: DEBUG 
nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-0000000e as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:10:10.560370 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-0000000e as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:10:10.650236 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:10:10.652809 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1571MB free_disk=29.95368194580078GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", 
"address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:10:10.653576 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:10.654085 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:10.897654 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance cd08791f-846f-4204-97a6-eb1701ed0723 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 2}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:10:10.898362 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 181b742c-8cfa-491a-abac-26a7ade35dcf actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:10:10.898362 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 10be0cac-53da-4f6f-acba-455135d2b5be actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:10:10.904402 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 3 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:10:10.908585 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=896MB phys_disk=29GB used_disk=4GB total_vcpus=8 used_vcpus=3 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:10:11.346463 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Successfully updated port: 
bcef4154-0fd0-485b-8249-0c3a3fb4e45f {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:10:11.363894 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Acquiring lock "refresh_cache-10be0cac-53da-4f6f-acba-455135d2b5be" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:10:11.364535 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Acquired lock "refresh_cache-10be0cac-53da-4f6f-acba-455135d2b5be" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:10:11.365123 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:10:11.370252 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0c7dc7f8-aa03-4425-85fd-ea63315458b9 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Acquiring lock "interface-181b742c-8cfa-491a-abac-26a7ade35dcf-None" by "nova.compute.manager.ComputeManager.attach_interface..do_attach_interface" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:11.370865 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0c7dc7f8-aa03-4425-85fd-ea63315458b9 
tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "interface-181b742c-8cfa-491a-abac-26a7ade35dcf-None" acquired by "nova.compute.manager.ComputeManager.attach_interface..do_attach_interface" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:11.371683 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-0c7dc7f8-aa03-4425-85fd-ea63315458b9 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lazy-loading 'flavor' on Instance uuid 181b742c-8cfa-491a-abac-26a7ade35dcf {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:10:11.495677 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-0c7dc7f8-aa03-4425-85fd-ea63315458b9 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lazy-loading 'pci_requests' on Instance uuid 181b742c-8cfa-491a-abac-26a7ade35dcf {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:10:11.609065 np0035104604 nova-compute[107505]: DEBUG nova.objects.base [None req-0c7dc7f8-aa03-4425-85fd-ea63315458b9 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Object Instance<181b742c-8cfa-491a-abac-26a7ade35dcf> lazy-loaded attributes: flavor,pci_requests {{(pid=107505) wrapper /opt/stack/nova/nova/objects/base.py:126}} Aug 30 14:10:11.609409 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-0c7dc7f8-aa03-4425-85fd-ea63315458b9 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:10:11.612820 np0035104604 
nova-compute[107505]: DEBUG nova.network.neutron [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:10:11.615320 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:11.835341 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:12.149257 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-2c54e719-2bfd-4498-8825-f68c954e675d req-5e48f8cd-d2a6-4467-9288-d7c8ac27cd22 service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Received event network-changed-bcef4154-0fd0-485b-8249-0c3a3fb4e45f {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:12.149772 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-2c54e719-2bfd-4498-8825-f68c954e675d req-5e48f8cd-d2a6-4467-9288-d7c8ac27cd22 service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Refreshing instance network info cache due to event network-changed-bcef4154-0fd0-485b-8249-0c3a3fb4e45f. 
{{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:10:12.149915 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-2c54e719-2bfd-4498-8825-f68c954e675d req-5e48f8cd-d2a6-4467-9288-d7c8ac27cd22 service nova] Acquiring lock "refresh_cache-10be0cac-53da-4f6f-acba-455135d2b5be" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:10:12.324187 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0c7dc7f8-aa03-4425-85fd-ea63315458b9 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "interface-181b742c-8cfa-491a-abac-26a7ade35dcf-None" "released" by "nova.compute.manager.ComputeManager.attach_interface..do_attach_interface" :: held 0.953s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:12.698778 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.864s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:12.707245 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:10:12.713920 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Acquiring lock "181b742c-8cfa-491a-abac-26a7ade35dcf" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:12.714128 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:12.714464 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Acquiring lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:12.714831 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:12.715086 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" "released" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:12.717954 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Terminating instance Aug 30 14:10:12.720327 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Start destroying the instance on the hypervisor. {{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:10:12.724670 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:10:12.760920 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:10:12.761406 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.107s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:12.780566 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Updating instance_info_cache with network_info: [{"id": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "address": "fa:16:3e:00:56:50", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcef4154-0f", "ovs_interfaceid": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:10:12.803324 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Releasing lock 
"refresh_cache-10be0cac-53da-4f6f-acba-455135d2b5be" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:10:12.803943 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Instance network_info: |[{"id": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "address": "fa:16:3e:00:56:50", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcef4154-0f", "ovs_interfaceid": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:10:12.806234 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:12.810983 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-2c54e719-2bfd-4498-8825-f68c954e675d req-5e48f8cd-d2a6-4467-9288-d7c8ac27cd22 service nova] Acquired lock "refresh_cache-10be0cac-53da-4f6f-acba-455135d2b5be" 
{{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:10:12.811500 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-2c54e719-2bfd-4498-8825-f68c954e675d req-5e48f8cd-d2a6-4467-9288-d7c8ac27cd22 service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Refreshing network info cache for port bcef4154-0fd0-485b-8249-0c3a3fb4e45f {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:10:12.822685 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Start _get_guest_xml network_info=[{"id": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "address": "fa:16:3e:00:56:50", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcef4154-0f", "ovs_interfaceid": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 
'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:10:12.823768 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:12.834010 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Aug 30 14:10:12.839275 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:12.844779 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:10:12.845605 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:10:12.847729 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:10:12.848241 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:10:12.850410 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:10:12.851258 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:10:12.851748 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:10:12.852006 np0035104604 nova-compute[107505]: DEBUG 
nova.virt.hardware [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:10:12.852377 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:10:12.852696 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:10:12.853007 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:10:12.853466 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:10:12.853827 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 
tempest-TenantUsagesTestJSON-1222411741-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:10:12.854104 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:10:12.854548 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:10:12.855203 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:10:12.879624 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:12.899287 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:13.055056 np0035104604 
nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Instance destroyed successfully. Aug 30 14:10:13.059434 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lazy-loading 'resources' on Instance uuid 181b742c-8cfa-491a-abac-26a7ade35dcf {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:10:13.073144 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:09:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-868222137',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-attachinterfacesv270test-server-868222137',id=14,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:10:09Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='f6283d5b3a59498c8e862844ef48e9f0',ramdisk_id='',reservation_id='r-o03shz4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services
=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachInterfacesV270Test-2030737557',owner_user_name='tempest-AttachInterfacesV270Test-2030737557-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-08-30T14:10:09Z,user_data=None,user_id='3903524595c2417b9e0f6c77c97b986c',uuid=181b742c-8cfa-491a-abac-26a7ade35dcf,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "address": "fa:16:3e:a8:a9:b4", "network": {"id": "266b543e-1ede-4388-a169-60bf5491ca59", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2139002279-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "f6283d5b3a59498c8e862844ef48e9f0", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c8c467b-11", "ovs_interfaceid": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:10:13.073767 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Converting VIF {"id": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "address": "fa:16:3e:a8:a9:b4", "network": {"id": "266b543e-1ede-4388-a169-60bf5491ca59", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2139002279-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "f6283d5b3a59498c8e862844ef48e9f0", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c8c467b-11", "ovs_interfaceid": "4c8c467b-1126-49ec-9cd3-648ecd61b8c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:10:13.075106 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:a8:a9:b4,bridge_name='br-int',has_traffic_filtering=True,id=4c8c467b-1126-49ec-9cd3-648ecd61b8c7,network=Network(266b543e-1ede-4388-a169-60bf5491ca59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c8c467b-11') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:10:13.075595 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:a9:b4,bridge_name='br-int',has_traffic_filtering=True,id=4c8c467b-1126-49ec-9cd3-648ecd61b8c7,network=Network(266b543e-1ede-4388-a169-60bf5491ca59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c8c467b-11') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:10:13.080423 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:13.081003 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c8c467b-11, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:10:13.082996 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:13.087769 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 
14:10:13.091470 np0035104604 nova-compute[107505]: INFO os_vif [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:a9:b4,bridge_name='br-int',has_traffic_filtering=True,id=4c8c467b-1126-49ec-9cd3-648ecd61b8c7,network=Network(266b543e-1ede-4388-a169-60bf5491ca59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c8c467b-11') Aug 30 14:10:13.517048 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:10:13.517607 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:10:13.518061 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:10:13.522521 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Deleting instance files /opt/stack/data/nova/instances/181b742c-8cfa-491a-abac-26a7ade35dcf_del Aug 30 14:10:13.524620 np0035104604 nova-compute[107505]: INFO 
nova.virt.libvirt.driver [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Deletion of /opt/stack/data/nova/instances/181b742c-8cfa-491a-abac-26a7ade35dcf_del complete Aug 30 14:10:13.534852 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:10:13.535422 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:10:13.708259 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:13.715451 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.836s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:13.757378 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] rbd image 10be0cac-53da-4f6f-acba-455135d2b5be_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:13.771077 np0035104604 
nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:13.965748 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Took 1.24 seconds to destroy the instance on the hypervisor. Aug 30 14:10:13.966437 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:10:13.966908 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:10:13.967130 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:10:14.084816 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:10:14.227008 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-0c86cea5-4b38-4369-bcc3-d115ba2c9cb3 req-34995979-21ab-4dd6-b1e1-337ececf5f4f service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Received event network-vif-unplugged-4c8c467b-1126-49ec-9cd3-648ecd61b8c7 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:14.227322 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0c86cea5-4b38-4369-bcc3-d115ba2c9cb3 req-34995979-21ab-4dd6-b1e1-337ececf5f4f service nova] Acquiring lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:14.227643 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0c86cea5-4b38-4369-bcc3-d115ba2c9cb3 req-34995979-21ab-4dd6-b1e1-337ececf5f4f service nova] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" 
acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:14.227986 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0c86cea5-4b38-4369-bcc3-d115ba2c9cb3 req-34995979-21ab-4dd6-b1e1-337ececf5f4f service nova] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:14.228319 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-0c86cea5-4b38-4369-bcc3-d115ba2c9cb3 req-34995979-21ab-4dd6-b1e1-337ececf5f4f service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] No waiting events found dispatching network-vif-unplugged-4c8c467b-1126-49ec-9cd3-648ecd61b8c7 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:10:14.228699 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-0c86cea5-4b38-4369-bcc3-d115ba2c9cb3 req-34995979-21ab-4dd6-b1e1-337ececf5f4f service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Received event network-vif-unplugged-4c8c467b-1126-49ec-9cd3-648ecd61b8c7 for instance with task_state deleting. 
{{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Aug 30 14:10:14.229070 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-0c86cea5-4b38-4369-bcc3-d115ba2c9cb3 req-34995979-21ab-4dd6-b1e1-337ececf5f4f service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Received event network-vif-plugged-4c8c467b-1126-49ec-9cd3-648ecd61b8c7 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:14.229459 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0c86cea5-4b38-4369-bcc3-d115ba2c9cb3 req-34995979-21ab-4dd6-b1e1-337ececf5f4f service nova] Acquiring lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:14.232897 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0c86cea5-4b38-4369-bcc3-d115ba2c9cb3 req-34995979-21ab-4dd6-b1e1-337ececf5f4f service nova] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:14.232897 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0c86cea5-4b38-4369-bcc3-d115ba2c9cb3 req-34995979-21ab-4dd6-b1e1-337ececf5f4f service nova] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:14.232897 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-0c86cea5-4b38-4369-bcc3-d115ba2c9cb3 req-34995979-21ab-4dd6-b1e1-337ececf5f4f service nova] [instance: 
181b742c-8cfa-491a-abac-26a7ade35dcf] No waiting events found dispatching network-vif-plugged-4c8c467b-1126-49ec-9cd3-648ecd61b8c7 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:10:14.232897 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-0c86cea5-4b38-4369-bcc3-d115ba2c9cb3 req-34995979-21ab-4dd6-b1e1-337ececf5f4f service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Received unexpected event network-vif-plugged-4c8c467b-1126-49ec-9cd3-648ecd61b8c7 for instance with vm_state active and task_state deleting. Aug 30 14:10:14.543372 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.772s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:14.545356 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:10:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TenantUsagesTestJSON-server-1488569924',display_name='tempest-TenantUsagesTestJSON-server-1488569924',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-tenantusagestestjson-server-1488569924',id=15,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b4618fbd680046438e24accb6411f011',ramdisk_id='',reservation_id='r-zldutu1y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-TenantUsagesTestJSON-1222411741',owner_user_name='tempest-TenantUsagesTestJSON-1222411741-project-member'},tags=TagList,task_state='spawning',terminated_at=None,
trusted_certs=None,updated_at=2023-08-30T14:10:09Z,user_data=None,user_id='68051cbd26d345158739406b63f39f73',uuid=10be0cac-53da-4f6f-acba-455135d2b5be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "address": "fa:16:3e:00:56:50", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcef4154-0f", "ovs_interfaceid": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:10:14.546106 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Converting VIF {"id": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "address": "fa:16:3e:00:56:50", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcef4154-0f", "ovs_interfaceid": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:10:14.546860 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:56:50,bridge_name='br-int',has_traffic_filtering=True,id=bcef4154-0fd0-485b-8249-0c3a3fb4e45f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbcef4154-0f') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:10:14.548441 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lazy-loading 'pci_devices' on Instance uuid 10be0cac-53da-4f6f-acba-455135d2b5be {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:10:14.561521 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] End _get_guest_xml xml= Aug 30 14:10:14.561521 np0035104604 nova-compute[107505]: 
[guest domain XML body; markup was stripped during log capture. Recoverable values: name instance-0000000f, uuid 10be0cac-53da-4f6f-acba-455135d2b5be, nova:name tempest-TenantUsagesTestJSON-server-1488569924, creationTime 2023-08-30 14:10:12, memory 131072 KiB (flavor: 128 MB, 1 vCPU, root 1 GB, ephemeral 0, swap 0), owner tempest-TenantUsagesTestJSON-1222411741-project-member / tempest-TenantUsagesTestJSON-1222411741, sysinfo OpenStack Foundation / OpenStack Nova / 27.1.0 / Virtual Machine, os type hvm, cpu model Nehalem, rng backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:10:14.568218 np0035104604 
nova-compute[107505]: DEBUG nova.compute.manager [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Preparing to wait for external event network-vif-plugged-bcef4154-0fd0-485b-8249-0c3a3fb4e45f {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:10:14.568218 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Acquiring lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:14.568218 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:14.568218 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:14.568218 np0035104604 nova-compute[107505]: DEBUG 
nova.virt.libvirt.vif [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:10:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TenantUsagesTestJSON-server-1488569924',display_name='tempest-TenantUsagesTestJSON-server-1488569924',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-tenantusagestestjson-server-1488569924',id=15,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b4618fbd680046438e24accb6411f011',ramdisk_id='',reservation_id='r-zldutu1y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True
',owner_project_name='tempest-TenantUsagesTestJSON-1222411741',owner_user_name='tempest-TenantUsagesTestJSON-1222411741-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:10:09Z,user_data=None,user_id='68051cbd26d345158739406b63f39f73',uuid=10be0cac-53da-4f6f-acba-455135d2b5be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "address": "fa:16:3e:00:56:50", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcef4154-0f", "ovs_interfaceid": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:10:14.568713 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Converting VIF {"id": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "address": "fa:16:3e:00:56:50", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 
4, "meta": {}}, "ips": [{"address": "192.168.233.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcef4154-0f", "ovs_interfaceid": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:10:14.568713 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:56:50,bridge_name='br-int',has_traffic_filtering=True,id=bcef4154-0fd0-485b-8249-0c3a3fb4e45f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbcef4154-0f') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:10:14.568713 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:56:50,bridge_name='br-int',has_traffic_filtering=True,id=bcef4154-0fd0-485b-8249-0c3a3fb4e45f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbcef4154-0f') {{(pid=107505) plug 
/opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:10:14.568713 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:14.568713 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:10:14.569264 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:10:14.573216 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:14.573686 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbcef4154-0f, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:10:14.574701 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbcef4154-0f, col_values=(('external_ids', {'iface-id': 'bcef4154-0fd0-485b-8249-0c3a3fb4e45f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:00:56:50', 'vm-uuid': '10be0cac-53da-4f6f-acba-455135d2b5be'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 
30 14:10:14.577213 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:14.583177 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:10:14.586088 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:14.588795 np0035104604 nova-compute[107505]: INFO os_vif [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:56:50,bridge_name='br-int',has_traffic_filtering=True,id=bcef4154-0fd0-485b-8249-0c3a3fb4e45f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbcef4154-0f') Aug 30 14:10:14.633842 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:10:14.634197 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:10:14.634612 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] No VIF found with MAC fa:16:3e:00:56:50, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:10:14.635478 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Using config drive Aug 30 14:10:14.672577 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] rbd image 10be0cac-53da-4f6f-acba-455135d2b5be_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:14.679118 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-2c54e719-2bfd-4498-8825-f68c954e675d req-5e48f8cd-d2a6-4467-9288-d7c8ac27cd22 service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Updated VIF entry in instance network info cache for port bcef4154-0fd0-485b-8249-0c3a3fb4e45f. 
{{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:10:14.679885 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-2c54e719-2bfd-4498-8825-f68c954e675d req-5e48f8cd-d2a6-4467-9288-d7c8ac27cd22 service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Updating instance_info_cache with network_info: [{"id": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "address": "fa:16:3e:00:56:50", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcef4154-0f", "ovs_interfaceid": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:10:14.826833 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-2c54e719-2bfd-4498-8825-f68c954e675d req-5e48f8cd-d2a6-4467-9288-d7c8ac27cd22 service nova] Releasing lock "refresh_cache-10be0cac-53da-4f6f-acba-455135d2b5be" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:10:15.036714 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task 
ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:10:15.051452 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:10:15.070996 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Creating config drive at /opt/stack/data/nova/instances/10be0cac-53da-4f6f-acba-455135d2b5be/disk.config Aug 30 14:10:15.075281 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/10be0cac-53da-4f6f-acba-455135d2b5be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpiyh0cgl_ {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:15.107021 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Took 1.14 seconds to deallocate network for instance. 
Aug 30 14:10:15.117672 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/10be0cac-53da-4f6f-acba-455135d2b5be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpiyh0cgl_" returned: 0 in 0.042s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:15.149413 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] rbd image 10be0cac-53da-4f6f-acba-455135d2b5be_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:15.153501 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/10be0cac-53da-4f6f-acba-455135d2b5be/disk.config 10be0cac-53da-4f6f-acba-455135d2b5be_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:15.184379 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 
30 14:10:15.185232 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:15.345516 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/10be0cac-53da-4f6f-acba-455135d2b5be/disk.config 10be0cac-53da-4f6f-acba-455135d2b5be_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.192s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:15.346260 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Deleting local config drive /opt/stack/data/nova/instances/10be0cac-53da-4f6f-acba-455135d2b5be/disk.config because it was imported into RBD. 
Aug 30 14:10:15.524656 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:15.765808 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-55f761d0-142f-4183-999c-b8912f2c9b86 req-dd3fdd30-b9b5-44e3-8ed0-164df71a96c0 service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Received event network-vif-plugged-bcef4154-0fd0-485b-8249-0c3a3fb4e45f {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:15.766648 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-55f761d0-142f-4183-999c-b8912f2c9b86 req-dd3fdd30-b9b5-44e3-8ed0-164df71a96c0 service nova] Acquiring lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:15.767152 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-55f761d0-142f-4183-999c-b8912f2c9b86 req-dd3fdd30-b9b5-44e3-8ed0-164df71a96c0 service nova] Lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:15.767616 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-55f761d0-142f-4183-999c-b8912f2c9b86 req-dd3fdd30-b9b5-44e3-8ed0-164df71a96c0 service nova] Lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:15.768768 np0035104604 nova-compute[107505]: DEBUG 
nova.compute.manager [req-55f761d0-142f-4183-999c-b8912f2c9b86 req-dd3fdd30-b9b5-44e3-8ed0-164df71a96c0 service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Processing event network-vif-plugged-bcef4154-0fd0-485b-8249-0c3a3fb4e45f {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:10:15.955755 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:15.982489 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:16.179409 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-21a809b3-38b9-47a3-b51b-ee6ae8c224f2 req-d116a54e-5a05-4a4c-b12a-99226a1ac4c0 service nova] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Received event network-vif-deleted-4c8c467b-1126-49ec-9cd3-648ecd61b8c7 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:16.478895 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:10:16.479110 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] VM Started (Lifecycle Event) Aug 30 14:10:16.489692 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 
tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:10:16.494159 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:10:16.497656 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Instance spawned successfully. Aug 30 14:10:16.499807 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:10:16.509981 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:16.514301 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM 
power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:10:16.538457 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:16.538914 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:16.539734 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:16.540046 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:16.540844 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 
tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:16.541594 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:16.548018 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:10:16.548487 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:10:16.549472 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] VM Paused (Lifecycle Event) Aug 30 14:10:16.567409 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:16.572404 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:10:16.573089 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None 
req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] VM Resumed (Lifecycle Event) Aug 30 14:10:16.614547 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:16.619399 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:10:16.635771 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Took 7.41 seconds to spawn the instance on the hypervisor. Aug 30 14:10:16.636213 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:16.639481 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] During sync_power_state the instance has a pending task (spawning). Skip. 
Aug 30 14:10:16.729438 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Took 10.31 seconds to build instance. Aug 30 14:10:16.744615 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.762s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:16.750994 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-444cf97e-cae1-4645-8160-c1be8f7f1740 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock "10be0cac-53da-4f6f-acba-455135d2b5be" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.428s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:16.753155 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:10:16.767659 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Inventory has not changed for provider 
600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:10:16.800649 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.616s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:16.985233 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Deleted allocations for instance 181b742c-8cfa-491a-abac-26a7ade35dcf Aug 30 14:10:17.024727 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:10:17.081237 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8a9eaeae-8e0a-4366-bf2b-2412dc1d2890 tempest-AttachInterfacesV270Test-2030737557 tempest-AttachInterfacesV270Test-2030737557-project-member] Lock "181b742c-8cfa-491a-abac-26a7ade35dcf" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 
4.366s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:17.793482 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-3534b1c0-f3b9-4b6b-be37-b10aa21709f5 req-23e04675-8bf7-48b0-9dec-0b612903d2e9 service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Received event network-vif-plugged-bcef4154-0fd0-485b-8249-0c3a3fb4e45f {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:17.794041 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-3534b1c0-f3b9-4b6b-be37-b10aa21709f5 req-23e04675-8bf7-48b0-9dec-0b612903d2e9 service nova] Acquiring lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:17.794443 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-3534b1c0-f3b9-4b6b-be37-b10aa21709f5 req-23e04675-8bf7-48b0-9dec-0b612903d2e9 service nova] Lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:17.795002 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-3534b1c0-f3b9-4b6b-be37-b10aa21709f5 req-23e04675-8bf7-48b0-9dec-0b612903d2e9 service nova] Lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:17.795313 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-3534b1c0-f3b9-4b6b-be37-b10aa21709f5 req-23e04675-8bf7-48b0-9dec-0b612903d2e9 service 
nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] No waiting events found dispatching network-vif-plugged-bcef4154-0fd0-485b-8249-0c3a3fb4e45f {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:10:17.795748 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-3534b1c0-f3b9-4b6b-be37-b10aa21709f5 req-23e04675-8bf7-48b0-9dec-0b612903d2e9 service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Received unexpected event network-vif-plugged-bcef4154-0fd0-485b-8249-0c3a3fb4e45f for instance with vm_state active and task_state None. Aug 30 14:10:17.840786 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Acquiring lock "10be0cac-53da-4f6f-acba-455135d2b5be" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:17.841111 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock "10be0cac-53da-4f6f-acba-455135d2b5be" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:17.841597 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Acquiring lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:17.842000 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:17.842378 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:17.845062 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Terminating instance Aug 30 14:10:17.847576 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Start destroying the instance on the hypervisor. 
{{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:10:17.910832 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:17.926859 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:17.936253 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:17.962828 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:17.969931 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:18.090898 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Instance destroyed successfully. 
Aug 30 14:10:18.091802 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lazy-loading 'resources' on Instance uuid 10be0cac-53da-4f6f-acba-455135d2b5be {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:10:18.104513 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:10:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TenantUsagesTestJSON-server-1488569924',display_name='tempest-TenantUsagesTestJSON-server-1488569924',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-tenantusagestestjson-server-1488569924',id=15,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2023-08-30T14:10:16Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b4618fbd680046438e24accb6411f011',ramdisk_id='',reservation_id='r-zldutu1y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab
-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-TenantUsagesTestJSON-1222411741',owner_user_name='tempest-TenantUsagesTestJSON-1222411741-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-08-30T14:10:17Z,user_data=None,user_id='68051cbd26d345158739406b63f39f73',uuid=10be0cac-53da-4f6f-acba-455135d2b5be,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "address": "fa:16:3e:00:56:50", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcef4154-0f", "ovs_interfaceid": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 
14:10:18.105058 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Converting VIF {"id": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "address": "fa:16:3e:00:56:50", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbcef4154-0f", "ovs_interfaceid": "bcef4154-0fd0-485b-8249-0c3a3fb4e45f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:10:18.106487 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:56:50,bridge_name='br-int',has_traffic_filtering=True,id=bcef4154-0fd0-485b-8249-0c3a3fb4e45f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbcef4154-0f') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:10:18.107233 np0035104604 nova-compute[107505]: DEBUG 
os_vif [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:56:50,bridge_name='br-int',has_traffic_filtering=True,id=bcef4154-0fd0-485b-8249-0c3a3fb4e45f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbcef4154-0f') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:10:18.110528 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:18.111091 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbcef4154-0f, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:10:18.113297 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:18.118715 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:10:18.122566 np0035104604 nova-compute[107505]: INFO os_vif [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:00:56:50,bridge_name='br-int',has_traffic_filtering=True,id=bcef4154-0fd0-485b-8249-0c3a3fb4e45f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbcef4154-0f') Aug 30 14:10:18.582148 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Deleting instance files /opt/stack/data/nova/instances/10be0cac-53da-4f6f-acba-455135d2b5be_del Aug 30 14:10:18.582667 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Deletion of /opt/stack/data/nova/instances/10be0cac-53da-4f6f-acba-455135d2b5be_del complete Aug 30 14:10:18.658114 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Took 0.81 seconds to destroy the instance on the hypervisor. Aug 30 14:10:18.658870 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:10:18.659161 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:10:18.659801 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:10:19.212380 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:10:19.228315 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Took 0.57 seconds to deallocate network for instance. 
Aug 30 14:10:19.279669 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:19.280013 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:19.681618 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:19.713560 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:19.864164 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-6f9b21bf-9d8d-4504-80dc-97e02e893d8a req-3f38c14a-69f9-45fe-ab1e-5cd197c9330a service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Received event network-vif-unplugged-bcef4154-0fd0-485b-8249-0c3a3fb4e45f {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:19.864914 np0035104604 nova-compute[107505]: 
DEBUG oslo_concurrency.lockutils [req-6f9b21bf-9d8d-4504-80dc-97e02e893d8a req-3f38c14a-69f9-45fe-ab1e-5cd197c9330a service nova] Acquiring lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:19.868957 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-6f9b21bf-9d8d-4504-80dc-97e02e893d8a req-3f38c14a-69f9-45fe-ab1e-5cd197c9330a service nova] Lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.004s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:19.869443 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-6f9b21bf-9d8d-4504-80dc-97e02e893d8a req-3f38c14a-69f9-45fe-ab1e-5cd197c9330a service nova] Lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:19.870292 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-6f9b21bf-9d8d-4504-80dc-97e02e893d8a req-3f38c14a-69f9-45fe-ab1e-5cd197c9330a service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] No waiting events found dispatching network-vif-unplugged-bcef4154-0fd0-485b-8249-0c3a3fb4e45f {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:10:19.870930 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-6f9b21bf-9d8d-4504-80dc-97e02e893d8a req-3f38c14a-69f9-45fe-ab1e-5cd197c9330a service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Received unexpected event network-vif-unplugged-bcef4154-0fd0-485b-8249-0c3a3fb4e45f for instance with 
vm_state deleted and task_state None. Aug 30 14:10:19.871502 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-6f9b21bf-9d8d-4504-80dc-97e02e893d8a req-3f38c14a-69f9-45fe-ab1e-5cd197c9330a service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Received event network-vif-plugged-bcef4154-0fd0-485b-8249-0c3a3fb4e45f {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:19.872250 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-6f9b21bf-9d8d-4504-80dc-97e02e893d8a req-3f38c14a-69f9-45fe-ab1e-5cd197c9330a service nova] Acquiring lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:19.872681 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-6f9b21bf-9d8d-4504-80dc-97e02e893d8a req-3f38c14a-69f9-45fe-ab1e-5cd197c9330a service nova] Lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:19.873141 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-6f9b21bf-9d8d-4504-80dc-97e02e893d8a req-3f38c14a-69f9-45fe-ab1e-5cd197c9330a service nova] Lock "10be0cac-53da-4f6f-acba-455135d2b5be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:19.873556 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-6f9b21bf-9d8d-4504-80dc-97e02e893d8a req-3f38c14a-69f9-45fe-ab1e-5cd197c9330a service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] No waiting events found 
dispatching network-vif-plugged-bcef4154-0fd0-485b-8249-0c3a3fb4e45f {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:10:19.874041 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-6f9b21bf-9d8d-4504-80dc-97e02e893d8a req-3f38c14a-69f9-45fe-ab1e-5cd197c9330a service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Received unexpected event network-vif-plugged-bcef4154-0fd0-485b-8249-0c3a3fb4e45f for instance with vm_state deleted and task_state None. Aug 30 14:10:19.874521 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-6f9b21bf-9d8d-4504-80dc-97e02e893d8a req-3f38c14a-69f9-45fe-ab1e-5cd197c9330a service nova] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Received event network-vif-deleted-bcef4154-0fd0-485b-8249-0c3a3fb4e45f {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:20.456061 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.742s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:20.462282 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:10:20.478358 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] 
Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:10:20.515016 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.235s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:20.685336 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Deleted allocations for instance 10be0cac-53da-4f6f-acba-455135d2b5be Aug 30 14:10:20.769961 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-69d4c279-00b5-4946-900a-2d2195b65349 tempest-TenantUsagesTestJSON-1222411741 tempest-TenantUsagesTestJSON-1222411741-project-member] Lock "10be0cac-53da-4f6f-acba-455135d2b5be" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.929s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:23.113301 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:28.054602 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:10:28.055307 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] VM Stopped (Lifecycle Event) Aug 30 14:10:28.074389 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-d9cdf16d-5adb-4291-a6ba-60cd6f3c61c9 None None] [instance: 181b742c-8cfa-491a-abac-26a7ade35dcf] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:28.114747 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:10:28.115170 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:28.115333 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=107505) run /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:117}} Aug 30 14:10:28.115562 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:10:28.116220 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:10:28.116581 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:28.125534 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:28.708003 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:29.664902 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:31.657190 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:32.314840 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:33.087946 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:10:33.088818 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] VM Stopped (Lifecycle Event) Aug 30 14:10:33.117422 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:33.125645 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-28846989-801d-419d-a3c9-1ed4cd05a9ee None None] [instance: 10be0cac-53da-4f6f-acba-455135d2b5be] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 
14:10:33.718361 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:35.820098 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Acquiring lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:35.820667 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:35.845210 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Starting instance... 
{{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:10:36.064475 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:36.065224 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:36.073945 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:10:36.074413 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Claim successful on node np0035104604 Aug 30 14:10:36.475171 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:36.678609 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:37.329557 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.649s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:37.337964 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:10:37.357459 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None 
req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:10:37.412651 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.347s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:37.413774 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:10:37.503471 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:10:37.503805 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:10:37.625856 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:10:37.651594 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Start building block device mappings for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:10:37.769328 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:10:37.771121 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:10:37.771879 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Creating image(s) Aug 30 14:10:37.804351 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] rbd image d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:37.838332 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] rbd image d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:37.867011 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] rbd image d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:37.871676 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:37.896797 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '99da94c3250e4df08bef191e04588171', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '13d6c819bc1543bbb5481c22adf548e8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:10:38.037668 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.166s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:38.038310 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None 
req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:38.039166 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:38.039590 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:38.073773 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] rbd image d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:38.078374 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Running cmd (subprocess): rbd import --pool vms 
/opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:38.122818 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:38.438198 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.360s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:38.509561 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] resizing rbd image d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:10:38.617088 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:10:38.617528 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d 
tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Ensure instance console log exists: /opt/stack/data/nova/instances/d3e215ac-b5f4-4d63-a3b7-22c9c3720570/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:10:38.618095 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:38.618518 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:38.618816 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:38.726481 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:38.798685 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:39.495785 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Successfully created port: 09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:10:40.386270 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Successfully updated port: 09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:10:40.405694 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Acquiring lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:10:40.406148 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Acquired lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:10:40.406944 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: 
d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:10:40.563176 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-5c0c12d6-752e-449d-90ff-703905012862 req-9da80f50-57a0-4420-8821-cc7ecedfca2a service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Received event network-changed-09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:40.563830 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-5c0c12d6-752e-449d-90ff-703905012862 req-9da80f50-57a0-4420-8821-cc7ecedfca2a service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Refreshing instance network info cache due to event network-changed-09224fee-8a35-4c18-b3c4-8c55dc653c62. {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:10:40.563830 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-5c0c12d6-752e-449d-90ff-703905012862 req-9da80f50-57a0-4420-8821-cc7ecedfca2a service nova] Acquiring lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:10:40.616561 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Instance cache missing network info. 
{{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:10:41.524439 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updating instance_info_cache with network_info: [{"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:10:41.544928 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Releasing lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:10:41.545360 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager 
[None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Instance network_info: |[{"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:10:41.545951 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-5c0c12d6-752e-449d-90ff-703905012862 req-9da80f50-57a0-4420-8821-cc7ecedfca2a service nova] Acquired lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:10:41.546271 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-5c0c12d6-752e-449d-90ff-703905012862 req-9da80f50-57a0-4420-8821-cc7ecedfca2a service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Refreshing network info cache for port 09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:10:41.552152 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Start _get_guest_xml network_info=[{"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:10:41.557649 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:10:41.564333 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:10:41.565072 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] CPU controller missing on host. 
{{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:10:41.725384 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:10:41.725716 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] CPU controller found on host. {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:10:41.727356 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:10:41.728029 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:10:41.728465 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:10:41.728833 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:10:41.729286 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:10:41.729690 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:10:41.730118 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d 
tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:10:41.730598 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:10:41.731011 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:10:41.731458 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:10:41.731903 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:10:41.732535 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 
tempest-ServersTestManualDisk-57483657-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:10:41.748375 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:42.541661 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.793s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:42.573296 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] rbd image d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:42.577337 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:42.968850 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquiring lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:42.969341 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:43.038160 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-5c0c12d6-752e-449d-90ff-703905012862 req-9da80f50-57a0-4420-8821-cc7ecedfca2a service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updated VIF entry in instance network info cache for port 09224fee-8a35-4c18-b3c4-8c55dc653c62. 
{{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:10:43.038914 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-5c0c12d6-752e-449d-90ff-703905012862 req-9da80f50-57a0-4420-8821-cc7ecedfca2a service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updating instance_info_cache with network_info: [{"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:10:43.052165 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Starting instance... 
{{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:10:43.057835 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-5c0c12d6-752e-449d-90ff-703905012862 req-9da80f50-57a0-4420-8821-cc7ecedfca2a service nova] Releasing lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:10:43.130217 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:43.259057 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.681s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:43.260992 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:10:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1906138070',display_name='tempest-ServersTestManualDisk-server-1906138070',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serverstestmanualdisk-server-1906138070',id=16,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPvXTXjVdhbUpv92S2TvfzMPoYC1l9OHahiBNeQMj3dh0MCCgAqnLpeVXL58D1mC9IzKrvbUKlqfO/pFjAzxhlD0s9T6CISLPK22eqeT5xx1Ul8L+CUPcem/Mk0/tnDjiA==',key_name='tempest-keypair-230544950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13d6c819bc1543bbb5481c22adf548e8',ramdisk_id='',reservation_id='r-57raj7c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',
image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-57483657',owner_user_name='tempest-ServersTestManualDisk-57483657-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:10:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='99da94c3250e4df08bef191e04588171',uuid=d3e215ac-b5f4-4d63-a3b7-22c9c3720570,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:10:43.261457 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Converting VIF {"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": 
"fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:10:43.262668 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:70:26,bridge_name='br-int',has_traffic_filtering=True,id=09224fee-8a35-4c18-b3c4-8c55dc653c62,network=Network(9a6c7309-6d00-4e31-ab3a-4b23c80f5c66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09224fee-8a') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:10:43.263936 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lazy-loading 'pci_devices' on Instance uuid d3e215ac-b5f4-4d63-a3b7-22c9c3720570 {{(pid=107505) obj_load_attr 
/opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:10:43.277226 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] End _get_guest_xml xml= [libvirt guest domain XML omitted: its markup was stripped during log extraction, leaving only element text; recoverable values include uuid d3e215ac-b5f4-4d63-a3b7-22c9c3720570, name instance-00000010, memory 131072, 1 vCPU, title tempest-ServersTestManualDisk-server-1906138070, creationTime 2023-08-30 14:10:41, owner tempest-ServersTestManualDisk-57483657-project-member / tempest-ServersTestManualDisk-57483657, sysinfo OpenStack Foundation / OpenStack Nova 27.1.0, os type hvm, cpu model Nehalem, rng backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:10:43.286511 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Preparing to wait for external event network-vif-plugged-09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:10:43.286511 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Acquiring lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:43.286511 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:43.286511 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657
tempest-ServersTestManualDisk-57483657-project-member] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:43.286511 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:10:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1906138070',display_name='tempest-ServersTestManualDisk-server-1906138070',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serverstestmanualdisk-server-1906138070',id=16,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPvXTXjVdhbUpv92S2TvfzMPoYC1l9OHahiBNeQMj3dh0MCCgAqnLpeVXL58D1mC9IzKrvbUKlqfO/pFjAzxhlD0s9T6CISLPK22eqeT5xx1Ul8L+CUPcem/Mk0/tnDjiA==',key_name='tempest-keypair-230544950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13d6c819bc1543bbb5481c22adf548e8',ramdisk_id='',reservation_id='r-57raj7c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-57483657',owner_user_name='tempest-ServersTestManualDisk-57483657-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:10:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='99da94c3250e4df08bef191e04588171',uuid=d3e215ac-b5f4-4d63-a3b7-22c9c3720570,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": 
"tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:10:43.287340 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Converting VIF {"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:10:43.287340 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:70:26,bridge_name='br-int',has_traffic_filtering=True,id=09224fee-8a35-4c18-b3c4-8c55dc653c62,network=Network(9a6c7309-6d00-4e31-ab3a-4b23c80f5c66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09224fee-8a') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:10:43.287340 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:70:26,bridge_name='br-int',has_traffic_filtering=True,id=09224fee-8a35-4c18-b3c4-8c55dc653c62,network=Network(9a6c7309-6d00-4e31-ab3a-4b23c80f5c66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09224fee-8a') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:10:43.287340 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:43.287340 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit 
/opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:10:43.287340 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:10:43.288912 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:43.288912 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09224fee-8a, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:10:43.290992 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09224fee-8a, col_values=(('external_ids', {'iface-id': '09224fee-8a35-4c18-b3c4-8c55dc653c62', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:70:26', 'vm-uuid': 'd3e215ac-b5f4-4d63-a3b7-22c9c3720570'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:10:43.291793 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:43.305248 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:43.305869 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:43.307154 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:10:43.308151 np0035104604 nova-compute[107505]: INFO os_vif [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:70:26,bridge_name='br-int',has_traffic_filtering=True,id=09224fee-8a35-4c18-b3c4-8c55dc653c62,network=Network(9a6c7309-6d00-4e31-ab3a-4b23c80f5c66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09224fee-8a') Aug 30 14:10:43.312621 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:10:43.313018 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Claim successful on node np0035104604 Aug 30 14:10:43.408716 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:10:43.409080 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:10:43.409459 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] No VIF found with MAC fa:16:3e:26:70:26, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:10:43.410361 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Using config drive Aug 30 14:10:43.439292 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] rbd image d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:43.812775 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:44.212239 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:44.243055 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 
tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Creating config drive at /opt/stack/data/nova/instances/d3e215ac-b5f4-4d63-a3b7-22c9c3720570/disk.config Aug 30 14:10:44.247640 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/d3e215ac-b5f4-4d63-a3b7-22c9c3720570/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmp0z3tqrz8 {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:44.281587 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/d3e215ac-b5f4-4d63-a3b7-22c9c3720570/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmp0z3tqrz8" returned: 0 in 0.033s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:44.313834 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] rbd image d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:44.317392 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Running 
cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/d3e215ac-b5f4-4d63-a3b7-22c9c3720570/disk.config d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:44.463257 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/d3e215ac-b5f4-4d63-a3b7-22c9c3720570/disk.config d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.145s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:44.464303 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Deleting local config drive /opt/stack/data/nova/instances/d3e215ac-b5f4-4d63-a3b7-22c9c3720570/disk.config because it was imported into RBD. 
Aug 30 14:10:44.486995 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:44.512152 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:44.515569 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:44.936660 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:44.954407 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-fd742b64-3d3f-41cf-9fde-fe5acb3cb806 req-a4b07766-348f-47ae-b81e-fddb65c619a5 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Received event network-vif-plugged-09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:44.954853 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-fd742b64-3d3f-41cf-9fde-fe5acb3cb806 req-a4b07766-348f-47ae-b81e-fddb65c619a5 service nova] Acquiring lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:44.955215 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-fd742b64-3d3f-41cf-9fde-fe5acb3cb806 req-a4b07766-348f-47ae-b81e-fddb65c619a5 service nova] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: 
waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:44.955514 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-fd742b64-3d3f-41cf-9fde-fe5acb3cb806 req-a4b07766-348f-47ae-b81e-fddb65c619a5 service nova] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:44.955859 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-fd742b64-3d3f-41cf-9fde-fe5acb3cb806 req-a4b07766-348f-47ae-b81e-fddb65c619a5 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Processing event network-vif-plugged-09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:10:44.956476 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:44.966494 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:45.099807 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.888s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:45.117427 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 
tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:10:45.136780 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:10:45.187919 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:45.202282 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.896s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:45.203023 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: 
a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:10:45.270627 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Allocating IP information in the background. {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:10:45.271023 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:10:45.423203 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:10:45.459384 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Start building block device mappings for instance. 
{{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:10:45.510009 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bf110227fdb94f70af671a2ffb9b4c04', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ea4a96dd5eeb416ca49cf98d07944952', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:10:45.642198 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:10:45.650847 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] VM Started (Lifecycle Event) Aug 30 14:10:45.654234 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:10:45.672557 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Checking state {{(pid=107505) _get_power_state 
/opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:45.673185 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:10:45.683121 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:10:45.687600 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Instance spawned successfully. Aug 30 14:10:45.688068 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:10:45.704329 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] During sync_power_state the instance has a pending task (spawning). Skip. 
Aug 30 14:10:45.704757 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:10:45.704983 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] VM Paused (Lifecycle Event) Aug 30 14:10:45.837963 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:45.839416 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:10:45.840501 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:10:45.840938 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Creating image(s) Aug 30 14:10:45.875138 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] rbd image a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:45.922545 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] rbd image a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:45.984931 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] rbd image a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:45.992349 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.processutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:46.033030 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:46.033570 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:46.036340 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:46.037285 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 
tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:46.038655 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:46.039768 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:46.056601 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:10:46.056965 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] VM Resumed (Lifecycle Event) Aug 30 14:10:46.087783 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:46.092791 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 
d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:10:46.112973 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.122s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:46.114065 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:46.115190 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:46.115803 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None 
req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:46.162540 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] rbd image a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:46.181430 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:46.206991 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:10:46.210653 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Took 8.44 seconds to spawn the instance on the hypervisor. 
Aug 30 14:10:46.211215 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:46.338580 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Took 10.42 seconds to build instance. Aug 30 14:10:46.361604 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-a7c98bc1-cd2c-4580-86ef-39af8d91e98d tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.541s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:46.536971 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.356s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:46.583373 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 
tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Successfully created port: 0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:10:46.797995 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] resizing rbd image a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:10:46.917449 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:10:46.918082 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Ensure instance console log exists: /opt/stack/data/nova/instances/a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:10:46.918557 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:46.919306 np0035104604 
nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:46.919637 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:47.029359 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-c0aabcf8-c836-4694-9771-2bf8a5ab89f1 req-a462345b-284d-4ac5-8bf1-3e1390aaf2e6 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Received event network-vif-plugged-09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:47.030131 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c0aabcf8-c836-4694-9771-2bf8a5ab89f1 req-a462345b-284d-4ac5-8bf1-3e1390aaf2e6 service nova] Acquiring lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:47.030531 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c0aabcf8-c836-4694-9771-2bf8a5ab89f1 req-a462345b-284d-4ac5-8bf1-3e1390aaf2e6 service nova] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:47.030889 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c0aabcf8-c836-4694-9771-2bf8a5ab89f1 req-a462345b-284d-4ac5-8bf1-3e1390aaf2e6 service nova] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:47.031522 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-c0aabcf8-c836-4694-9771-2bf8a5ab89f1 req-a462345b-284d-4ac5-8bf1-3e1390aaf2e6 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] No waiting events found dispatching network-vif-plugged-09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:10:47.031884 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-c0aabcf8-c836-4694-9771-2bf8a5ab89f1 req-a462345b-284d-4ac5-8bf1-3e1390aaf2e6 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Received unexpected event network-vif-plugged-09224fee-8a35-4c18-b3c4-8c55dc653c62 for instance with vm_state active and task_state None. 
Aug 30 14:10:47.891049 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Successfully updated port: 0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:10:47.906309 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquiring lock "refresh_cache-a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:10:47.906588 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquired lock "refresh_cache-a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:10:47.906980 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:10:48.108910 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ffcaee2c-19c7-4f5f-ad30-2b082d5624a1 req-c495386f-fa51-4bb5-bdf7-ef31f3c864dc service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Received event network-changed-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:48.109577 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ffcaee2c-19c7-4f5f-ad30-2b082d5624a1 req-c495386f-fa51-4bb5-bdf7-ef31f3c864dc service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Refreshing instance network info cache due to event network-changed-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7. {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:10:48.110224 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ffcaee2c-19c7-4f5f-ad30-2b082d5624a1 req-c495386f-fa51-4bb5-bdf7-ef31f3c864dc service nova] Acquiring lock "refresh_cache-a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:10:48.291253 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:48.303620 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Instance cache missing network info. 
{{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:10:48.823776 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:49.089950 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-4ce2ef43-6dc2-408d-a9f6-8e3c6cc400c5 req-87e63f40-c3c9-40fc-87f0-492cb9f87d95 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Received event network-changed-09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:49.089950 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-4ce2ef43-6dc2-408d-a9f6-8e3c6cc400c5 req-87e63f40-c3c9-40fc-87f0-492cb9f87d95 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Refreshing instance network info cache due to event network-changed-09224fee-8a35-4c18-b3c4-8c55dc653c62. 
{{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:10:49.089950 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-4ce2ef43-6dc2-408d-a9f6-8e3c6cc400c5 req-87e63f40-c3c9-40fc-87f0-492cb9f87d95 service nova] Acquiring lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:10:49.089950 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-4ce2ef43-6dc2-408d-a9f6-8e3c6cc400c5 req-87e63f40-c3c9-40fc-87f0-492cb9f87d95 service nova] Acquired lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:10:49.090935 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-4ce2ef43-6dc2-408d-a9f6-8e3c6cc400c5 req-87e63f40-c3c9-40fc-87f0-492cb9f87d95 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Refreshing network info cache for port 09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:10:49.569079 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Updating instance_info_cache with network_info: [{"id": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "address": "fa:16:3e:24:9e:9b", "network": {"id": "fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cb26a8d-2c", "ovs_interfaceid": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:10:49.687456 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Releasing lock "refresh_cache-a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:10:49.687877 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Instance network_info: |[{"id": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "address": "fa:16:3e:24:9e:9b", "network": {"id": "fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 1372, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cb26a8d-2c", "ovs_interfaceid": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:10:49.688439 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ffcaee2c-19c7-4f5f-ad30-2b082d5624a1 req-c495386f-fa51-4bb5-bdf7-ef31f3c864dc service nova] Acquired lock "refresh_cache-a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:10:49.688925 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-ffcaee2c-19c7-4f5f-ad30-2b082d5624a1 req-c495386f-fa51-4bb5-bdf7-ef31f3c864dc service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Refreshing network info cache for port 0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:10:49.695091 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Start _get_guest_xml network_info=[{"id": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "address": "fa:16:3e:24:9e:9b", "network": {"id": "fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cb26a8d-2c", "ovs_interfaceid": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:10:50.075742 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] This host appears to have multiple 
sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:10:50.079395 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:10:50.080363 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:10:50.084718 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:10:50.085234 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:10:50.087016 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:10:50.087551 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:10:50.087860 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:10:50.088122 np0035104604 nova-compute[107505]: 
DEBUG nova.virt.hardware [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:10:50.088431 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:10:50.088802 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:10:50.088932 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:10:50.089409 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:10:50.089819 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 
tempest-VolumesAdminNegativeTest-416011200-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:10:50.090234 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:10:50.090553 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:10:50.090884 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:10:50.105182 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:50.966989 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 
tempest-VolumesAdminNegativeTest-416011200-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.862s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:50.990592 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] rbd image a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:50.994258 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:51.016018 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-4ce2ef43-6dc2-408d-a9f6-8e3c6cc400c5 req-87e63f40-c3c9-40fc-87f0-492cb9f87d95 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updated VIF entry in instance network info cache for port 09224fee-8a35-4c18-b3c4-8c55dc653c62. 
{{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:10:51.016923 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-4ce2ef43-6dc2-408d-a9f6-8e3c6cc400c5 req-87e63f40-c3c9-40fc-87f0-492cb9f87d95 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updating instance_info_cache with network_info: [{"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.35", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:10:51.040282 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-4ce2ef43-6dc2-408d-a9f6-8e3c6cc400c5 req-87e63f40-c3c9-40fc-87f0-492cb9f87d95 service nova] Releasing lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:10:51.458996 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron 
[req-ffcaee2c-19c7-4f5f-ad30-2b082d5624a1 req-c495386f-fa51-4bb5-bdf7-ef31f3c864dc service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Updated VIF entry in instance network info cache for port 0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7. {{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:10:51.459931 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-ffcaee2c-19c7-4f5f-ad30-2b082d5624a1 req-c495386f-fa51-4bb5-bdf7-ef31f3c864dc service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Updating instance_info_cache with network_info: [{"id": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "address": "fa:16:3e:24:9e:9b", "network": {"id": "fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cb26a8d-2c", "ovs_interfaceid": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:10:51.475863 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ffcaee2c-19c7-4f5f-ad30-2b082d5624a1 req-c495386f-fa51-4bb5-bdf7-ef31f3c864dc service nova] Releasing lock "refresh_cache-a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" {{(pid=107505) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:10:51.685178 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.691s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:51.687699 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:10:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-33891592',display_name='tempest-VolumesAdminNegativeTest-server-33891592',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-volumesadminnegativetest-server-33891592',id=17,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFNxfMlTgh2AlbGeKCoNjSLqZkOwAUAo8SvV+lBmCfnE0eFq8n65XVY/dOcrTg6YnZH7zyAoMgPMPwvyChNd5HG2vPgUcx8zoLGgorKdVMcmOEgjCJY+qxWvpP1j+79+yQ==',key_name='tempest-keypair-1240208729',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ea4a96dd5eeb416ca49cf98d07944952',ramdisk_id='',reservation_id='r-szs3hgql',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-416011200',owner_user_name='tempest-VolumesAdminNegativeTest-416011200-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:10:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bf110227fdb94f70af671a2ffb9b4c04',uuid=a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "address": "fa:16:3e:24:9e:9b", "network": {"id": "fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": 
"tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cb26a8d-2c", "ovs_interfaceid": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:10:51.688641 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Converting VIF {"id": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "address": "fa:16:3e:24:9e:9b", "network": {"id": "fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cb26a8d-2c", "ovs_interfaceid": 
"0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:10:51.690219 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:9e:9b,bridge_name='br-int',has_traffic_filtering=True,id=0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7,network=Network(fc683610-55d4-42da-b1c6-1b0e5b095492),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0cb26a8d-2c') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:10:51.692063 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lazy-loading 'pci_devices' on Instance uuid a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:10:51.710142 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] End _get_guest_xml xml= [libvirt domain XML elided: the element markup was stripped during log extraction, leaving only text values interleaved with journald timestamp prefixes; the surviving values were: uuid a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f, name instance-00000011, memory 131072, vcpus 1, nova metadata name tempest-VolumesAdminNegativeTest-server-33891592, creationTime 2023-08-30 14:10:50, flavor fields 128 / 1 / 0 / 0 / 1, owner user tempest-VolumesAdminNegativeTest-416011200-project-member, owner project tempest-VolumesAdminNegativeTest-416011200, sysinfo manufacturer OpenStack Foundation, product OpenStack Nova, version 27.1.0, serial/uuid a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f, family Virtual Machine, os type hvm, cpu model Nehalem, rng backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:10:51.719574 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Preparing to wait for external event 
network-vif-plugged-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:10:51.720468 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquiring lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:51.721258 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:51.721904 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:51.723145 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:10:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-33891592',display_name='tempest-VolumesAdminNegativeTest-server-33891592',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-volumesadminnegativetest-server-33891592',id=17,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFNxfMlTgh2AlbGeKCoNjSLqZkOwAUAo8SvV+lBmCfnE0eFq8n65XVY/dOcrTg6YnZH7zyAoMgPMPwvyChNd5HG2vPgUcx8zoLGgorKdVMcmOEgjCJY+qxWvpP1j+79+yQ==',key_name='tempest-keypair-1240208729',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ea4a96dd5eeb416ca49cf98d07944952',ramdisk_id='',reservation_id='r-szs3hgql',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner
_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-416011200',owner_user_name='tempest-VolumesAdminNegativeTest-416011200-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:10:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bf110227fdb94f70af671a2ffb9b4c04',uuid=a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "address": "fa:16:3e:24:9e:9b", "network": {"id": "fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cb26a8d-2c", "ovs_interfaceid": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:10:51.723778 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Converting VIF {"id": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "address": "fa:16:3e:24:9e:9b", 
"network": {"id": "fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cb26a8d-2c", "ovs_interfaceid": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:10:51.725072 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:9e:9b,bridge_name='br-int',has_traffic_filtering=True,id=0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7,network=Network(fc683610-55d4-42da-b1c6-1b0e5b095492),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0cb26a8d-2c') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:10:51.726000 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:24:9e:9b,bridge_name='br-int',has_traffic_filtering=True,id=0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7,network=Network(fc683610-55d4-42da-b1c6-1b0e5b095492),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0cb26a8d-2c') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:10:51.727267 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:51.728178 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:10:51.729005 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:10:51.733173 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:51.733830 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0cb26a8d-2c, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:10:51.734905 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0cb26a8d-2c, 
col_values=(('external_ids', {'iface-id': '0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:9e:9b', 'vm-uuid': 'a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:10:51.736773 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:51.741591 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:10:51.744835 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:51.746774 np0035104604 nova-compute[107505]: INFO os_vif [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:9e:9b,bridge_name='br-int',has_traffic_filtering=True,id=0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7,network=Network(fc683610-55d4-42da-b1c6-1b0e5b095492),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0cb26a8d-2c') Aug 30 14:10:51.791302 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:10:51.791816 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] No BDM found with device name hda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:10:51.792214 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] No VIF found with MAC fa:16:3e:24:9e:9b, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:10:51.793194 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Using config drive Aug 30 14:10:51.821668 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] rbd image a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:52.201010 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Creating config drive at /opt/stack/data/nova/instances/a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f/disk.config Aug 30 14:10:52.209011 np0035104604 
nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmp2pmqy89d {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:52.248401 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmp2pmqy89d" returned: 0 in 0.040s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:52.282527 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] rbd image a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:10:52.286568 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f/disk.config a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_disk.config --image-format=2 --id cinder --conf 
/etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:10:52.418826 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f/disk.config a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.132s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:10:52.420065 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Deleting local config drive /opt/stack/data/nova/instances/a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f/disk.config because it was imported into RBD. 
Aug 30 14:10:52.440739 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:52.490774 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:52.758471 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-c6314399-8770-47fe-b5d1-b89f7fd1ca59 req-f621e36f-f9c4-4d59-92a3-18d1d830c363 service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Received event network-vif-plugged-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:52.759220 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c6314399-8770-47fe-b5d1-b89f7fd1ca59 req-f621e36f-f9c4-4d59-92a3-18d1d830c363 service nova] Acquiring lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:52.759820 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c6314399-8770-47fe-b5d1-b89f7fd1ca59 req-f621e36f-f9c4-4d59-92a3-18d1d830c363 service nova] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:52.760336 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c6314399-8770-47fe-b5d1-b89f7fd1ca59 req-f621e36f-f9c4-4d59-92a3-18d1d830c363 service nova] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:52.760933 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-c6314399-8770-47fe-b5d1-b89f7fd1ca59 req-f621e36f-f9c4-4d59-92a3-18d1d830c363 service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Processing event network-vif-plugged-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:10:52.874038 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:52.882866 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:52.893987 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:53.343683 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:10:53.344872 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] VM Started (Lifecycle Event) Aug 30 14:10:53.348961 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Instance event wait completed in 0 
seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:10:53.361824 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:10:53.370594 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:53.372964 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Instance spawned successfully. Aug 30 14:10:53.374033 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:10:53.380748 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:10:53.405034 
np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:53.405788 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:53.407916 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:53.409055 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:53.410022 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Found default for 
hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:53.410876 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:10:53.416390 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:10:53.417102 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:10:53.417490 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] VM Paused (Lifecycle Event) Aug 30 14:10:53.444222 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:53.449777 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:10:53.450268 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] VM Resumed 
(Lifecycle Event) Aug 30 14:10:53.543316 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Took 7.70 seconds to spawn the instance on the hypervisor. Aug 30 14:10:53.544305 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:53.547353 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:10:53.571534 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:10:53.603438 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] During sync_power_state the instance has a pending task (spawning). Skip. 
Aug 30 14:10:53.666211 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Took 10.54 seconds to build instance. Aug 30 14:10:53.685905 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-0e92afa5-445e-4da9-91a1-7b6decf62111 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.716s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:53.727984 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:54.813385 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-8c3f2aff-4b8b-4f04-9c72-a6fe242ecfeb req-3e7901f6-9fdd-4754-b47a-9ef4705f00e2 service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Received event network-vif-plugged-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:54.814632 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-8c3f2aff-4b8b-4f04-9c72-a6fe242ecfeb req-3e7901f6-9fdd-4754-b47a-9ef4705f00e2 service nova] Acquiring lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:54.815166 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils 
[req-8c3f2aff-4b8b-4f04-9c72-a6fe242ecfeb req-3e7901f6-9fdd-4754-b47a-9ef4705f00e2 service nova] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:54.815621 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-8c3f2aff-4b8b-4f04-9c72-a6fe242ecfeb req-3e7901f6-9fdd-4754-b47a-9ef4705f00e2 service nova] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:54.816112 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-8c3f2aff-4b8b-4f04-9c72-a6fe242ecfeb req-3e7901f6-9fdd-4754-b47a-9ef4705f00e2 service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] No waiting events found dispatching network-vif-plugged-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:10:54.816759 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-8c3f2aff-4b8b-4f04-9c72-a6fe242ecfeb req-3e7901f6-9fdd-4754-b47a-9ef4705f00e2 service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Received unexpected event network-vif-plugged-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 for instance with vm_state active and task_state None. 
Aug 30 14:10:56.339501 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-f0b7da7e-a3ef-4cb2-91c8-b0975a4e39c7 req-5198bd88-be31-4b6a-9d4e-f9604533f7ee service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Received event network-changed-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:56.340860 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-f0b7da7e-a3ef-4cb2-91c8-b0975a4e39c7 req-5198bd88-be31-4b6a-9d4e-f9604533f7ee service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Refreshing instance network info cache due to event network-changed-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7. {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:10:56.341603 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-f0b7da7e-a3ef-4cb2-91c8-b0975a4e39c7 req-5198bd88-be31-4b6a-9d4e-f9604533f7ee service nova] Acquiring lock "refresh_cache-a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:10:56.342061 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-f0b7da7e-a3ef-4cb2-91c8-b0975a4e39c7 req-5198bd88-be31-4b6a-9d4e-f9604533f7ee service nova] Acquired lock "refresh_cache-a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:10:56.342566 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-f0b7da7e-a3ef-4cb2-91c8-b0975a4e39c7 req-5198bd88-be31-4b6a-9d4e-f9604533f7ee service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Refreshing network info cache for port 0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:10:56.737969 np0035104604 nova-compute[107505]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:57.713138 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-f0b7da7e-a3ef-4cb2-91c8-b0975a4e39c7 req-5198bd88-be31-4b6a-9d4e-f9604533f7ee service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Updated VIF entry in instance network info cache for port 0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7. {{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:10:57.713755 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-f0b7da7e-a3ef-4cb2-91c8-b0975a4e39c7 req-5198bd88-be31-4b6a-9d4e-f9604533f7ee service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Updating instance_info_cache with network_info: [{"id": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "address": "fa:16:3e:24:9e:9b", "network": {"id": "fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.145", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cb26a8d-2c", "ovs_interfaceid": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:10:57.736108 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-f0b7da7e-a3ef-4cb2-91c8-b0975a4e39c7 req-5198bd88-be31-4b6a-9d4e-f9604533f7ee service nova] Releasing lock "refresh_cache-a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:10:57.885090 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "cd08791f-846f-4204-97a6-eb1701ed0723" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:57.885513 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "cd08791f-846f-4204-97a6-eb1701ed0723" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:57.885859 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:57.886571 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:57.887010 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:57.890781 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Terminating instance Aug 30 14:10:57.893567 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Start destroying the instance on the hypervisor. 
{{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:10:57.948837 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:57.967437 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:57.979072 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:58.012514 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:58.128832 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Instance destroyed successfully. 
Aug 30 14:10:58.129474 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lazy-loading 'resources' on Instance uuid cd08791f-846f-4204-97a6-eb1701ed0723 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:10:58.147628 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:09:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-851385372',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-851385372',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(34),hidden=False,host='np0035104604',hostname='tempest-serverswithspecificflavortestjson-server-851385372',id=13,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=34,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHHdXbSjxXwqDwe7VjknCsPB7FlaX9kbdidTvee1eM5xK4auSl9TOs3ulw+MzaECXIpkK+ZLwKiI+UH3J46BXA6E7xNkh4IyXHMT0+x8g5UZ8EaR+YM4WD80W1OwwUhUEg==',key_name='tempest-keypair-609321772',keypairs=,launch_index=0,launched_at=2023-08-30T14:09:51Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b35dffa3f90f40559bd146f98ed3083d',ramdisk_id='',reservation_id='r-aqmfaonr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-5414259',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-08-30T14:09:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d8043c2265984da88bd2252d8b2cf983',uuid=cd08791f-846f-4204-97a6-eb1701ed0723,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "address": "fa:16:3e:86:83:a5", "network": {"id": 
"33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.10", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0425cc-6d", "ovs_interfaceid": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:10:58.148055 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Converting VIF {"id": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "address": "fa:16:3e:86:83:a5", "network": {"id": "33008139-2fd9-496e-ac17-91db64b6f4c6", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1161556951-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.10", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "b35dffa3f90f40559bd146f98ed3083d", "mtu": 
1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0425cc-6d", "ovs_interfaceid": "fa0425cc-6d23-4e9d-90d9-c26582dd918a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:10:58.149063 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:86:83:a5,bridge_name='br-int',has_traffic_filtering=True,id=fa0425cc-6d23-4e9d-90d9-c26582dd918a,network=Network(33008139-2fd9-496e-ac17-91db64b6f4c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0425cc-6d') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:10:58.149633 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:83:a5,bridge_name='br-int',has_traffic_filtering=True,id=fa0425cc-6d23-4e9d-90d9-c26582dd918a,network=Network(33008139-2fd9-496e-ac17-91db64b6f4c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0425cc-6d') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:10:58.152185 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 
30 14:10:58.152674 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa0425cc-6d, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:10:58.155037 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:58.158974 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:10:58.162857 np0035104604 nova-compute[107505]: INFO os_vif [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:83:a5,bridge_name='br-int',has_traffic_filtering=True,id=fa0425cc-6d23-4e9d-90d9-c26582dd918a,network=Network(33008139-2fd9-496e-ac17-91db64b6f4c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0425cc-6d') Aug 30 14:10:58.424336 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-f82b8cb3-e333-40e2-992b-95d54b504a84 req-a6e2d4b6-582c-4d1f-9b5c-a19238d6eda0 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Received event network-vif-unplugged-fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:58.424689 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-f82b8cb3-e333-40e2-992b-95d54b504a84 req-a6e2d4b6-582c-4d1f-9b5c-a19238d6eda0 service nova] Acquiring lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:58.425443 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-f82b8cb3-e333-40e2-992b-95d54b504a84 req-a6e2d4b6-582c-4d1f-9b5c-a19238d6eda0 service nova] Lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:58.425964 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-f82b8cb3-e333-40e2-992b-95d54b504a84 req-a6e2d4b6-582c-4d1f-9b5c-a19238d6eda0 service nova] Lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:58.426706 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-f82b8cb3-e333-40e2-992b-95d54b504a84 req-a6e2d4b6-582c-4d1f-9b5c-a19238d6eda0 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] No waiting events found dispatching network-vif-unplugged-fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:10:58.427274 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-f82b8cb3-e333-40e2-992b-95d54b504a84 req-a6e2d4b6-582c-4d1f-9b5c-a19238d6eda0 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Received event network-vif-unplugged-fa0425cc-6d23-4e9d-90d9-c26582dd918a for instance with task_state deleting. 
{{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Aug 30 14:10:58.427724 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-f82b8cb3-e333-40e2-992b-95d54b504a84 req-a6e2d4b6-582c-4d1f-9b5c-a19238d6eda0 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Received event network-vif-plugged-fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:58.428208 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-f82b8cb3-e333-40e2-992b-95d54b504a84 req-a6e2d4b6-582c-4d1f-9b5c-a19238d6eda0 service nova] Acquiring lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:58.428725 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-f82b8cb3-e333-40e2-992b-95d54b504a84 req-a6e2d4b6-582c-4d1f-9b5c-a19238d6eda0 service nova] Lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:10:58.428996 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-f82b8cb3-e333-40e2-992b-95d54b504a84 req-a6e2d4b6-582c-4d1f-9b5c-a19238d6eda0 service nova] Lock "cd08791f-846f-4204-97a6-eb1701ed0723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:10:58.429272 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-f82b8cb3-e333-40e2-992b-95d54b504a84 req-a6e2d4b6-582c-4d1f-9b5c-a19238d6eda0 service nova] [instance: 
cd08791f-846f-4204-97a6-eb1701ed0723] No waiting events found dispatching network-vif-plugged-fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:10:58.429703 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-f82b8cb3-e333-40e2-992b-95d54b504a84 req-a6e2d4b6-582c-4d1f-9b5c-a19238d6eda0 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Received unexpected event network-vif-plugged-fa0425cc-6d23-4e9d-90d9-c26582dd918a for instance with vm_state active and task_state deleting. Aug 30 14:10:58.676812 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Deleting instance files /opt/stack/data/nova/instances/cd08791f-846f-4204-97a6-eb1701ed0723_del Aug 30 14:10:58.677420 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Deletion of /opt/stack/data/nova/instances/cd08791f-846f-4204-97a6-eb1701ed0723_del complete Aug 30 14:10:58.729694 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:10:58.755094 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Took 0.86 seconds to destroy the instance on the hypervisor. 
Aug 30 14:10:58.755727 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:10:58.756148 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:10:58.756387 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:10:59.832715 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:10:59.854196 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Took 1.10 seconds to deallocate network for instance. 
Aug 30 14:10:59.939001 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-b30d9ea5-bed2-4189-be64-c183099cce67 req-648de658-57fe-4b46-9f3c-4a4ac190c127 service nova] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Received event network-vif-deleted-fa0425cc-6d23-4e9d-90d9-c26582dd918a {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:10:59.945938 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:10:59.946599 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:11:00.553331 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:11:01.279471 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 
tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.726s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:11:01.286611 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:11:01.310913 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:11:01.349838 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.403s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:11:01.526315 
np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Deleted allocations for instance cd08791f-846f-4204-97a6-eb1701ed0723 Aug 30 14:11:01.650718 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ff34739e-af66-4909-bea8-8a39060e986a tempest-ServersWithSpecificFlavorTestJSON-5414259 tempest-ServersWithSpecificFlavorTestJSON-5414259-project-member] Lock "cd08791f-846f-4204-97a6-eb1701ed0723" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.765s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:11:03.333139 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:03.731606 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:06.581258 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:08.157453 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:08.734299 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:10.026004 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None 
req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:11:10.026645 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:11:10.027177 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:11:10.049305 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:11:10.049784 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:11:10.050200 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:11:10.050703 
np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:11:10.051298 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:11:13.126978 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:11:13.127715 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] VM Stopped (Lifecycle Event) Aug 30 14:11:13.149406 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-1514ccb8-8ad8-4a9e-ae70-69cbb40e0d32 None None] [instance: cd08791f-846f-4204-97a6-eb1701ed0723] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:11:13.158793 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:13.736097 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:18.161420 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:23.163511 np0035104604 nova-compute[107505]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:23.739015 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:28.165491 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:28.740939 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:31.708682 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 21.657s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:11:31.786880 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000011 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:11:31.787165 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000011 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:11:31.791109 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) 
_get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:11:31.791409 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:11:31.874561 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:11:31.876287 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1448MB free_disk=29.954235076904297GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:11:31.876524 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:11:31.876908 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:11:32.219678 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance d3e215ac-b5f4-4d63-a3b7-22c9c3720570 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:11:32.219956 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:11:32.220260 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 2 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:11:32.220571 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=768MB phys_disk=29GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:11:32.441773 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Refreshing inventories for resource provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Aug 30 14:11:32.589379 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Updating ProviderTree inventory for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 
'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Aug 30 14:11:32.589803 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Updating inventory in ProviderTree for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Aug 30 14:11:32.776921 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Refreshing aggregate associations for resource provider 600ab55f-530c-4be6-bf02-067d68ce7ee4, aggregates: None {{(pid=107505) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Aug 30 14:11:32.933673 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:32.955230 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Refreshing trait associations for resource provider 600ab55f-530c-4be6-bf02-067d68ce7ee4, traits: 
COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_SMMUV3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI {{(pid=107505) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Aug 30 14:11:33.265434 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:33.480400 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:11:33.743213 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 
{{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:34.055430 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.575s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:11:34.061782 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:11:34.080174 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:11:34.114060 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:11:34.114648 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.238s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:11:34.115661 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:11:34.116048 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Cleaning up deleted instances {{(pid=107505) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11129}} Aug 30 14:11:34.134358 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] There are 0 instances to clean {{(pid=107505) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11138}} Aug 30 14:11:34.135361 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:11:34.135740 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Cleaning up deleted instances with incomplete migration {{(pid=107505) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11167}} Aug 30 14:11:34.152817 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=107505) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:11:35.166107 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:11:35.166962 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:11:35.166962 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:11:35.167299 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:11:35.167683 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:11:35.360908 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:11:35.361600 np0035104604 
nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquired lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:11:35.361801 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Forcefully refreshing network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}} Aug 30 14:11:35.362181 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lazy-loading 'info_cache' on Instance uuid d3e215ac-b5f4-4d63-a3b7-22c9c3720570 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:11:36.695963 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updating instance_info_cache with network_info: [{"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.35", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:11:36.712919 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Releasing lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:11:36.713338 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updated the network info_cache for instance {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} Aug 30 14:11:36.714613 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:11:36.715031 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:11:36.715560 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:11:36.715902 
np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:11:36.716254 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:11:38.267019 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:38.539896 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Acquiring lock "5db380b6-b80a-4cb1-b65c-571bfe9b4341" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:11:38.540441 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Lock "5db380b6-b80a-4cb1-b65c-571bfe9b4341" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:11:38.558390 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 
tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Starting instance... {{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:11:38.782237 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:11:38.782722 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:11:38.791591 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:11:38.792188 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Claim successful on node np0035104604 Aug 30 14:11:39.499042 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:11:40.081461 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.582s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:11:40.088666 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:11:40.105761 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Inventory has not changed for 
provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:11:40.139787 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.357s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:11:40.141001 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:11:40.208601 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:11:40.209095 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:11:40.346593 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:11:40.369473 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Start building block device mappings for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:11:40.603418 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:11:40.604402 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:11:40.604883 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Creating image(s) Aug 30 14:11:40.640202 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] rbd image 5db380b6-b80a-4cb1-b65c-571bfe9b4341_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:11:40.670781 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] rbd image 5db380b6-b80a-4cb1-b65c-571bfe9b4341_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:11:40.703867 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] rbd image 5db380b6-b80a-4cb1-b65c-571bfe9b4341_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:11:40.707899 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.processutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:11:40.739826 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31659ce5f29944eab1ef53566e15c84f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4a37d1371f13469f8d99c0897c9e3375', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:11:40.866816 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._sync_power_states {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:11:40.870131 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 
-- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.162s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:11:40.871824 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:11:40.872517 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:11:40.872875 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:11:40.900275 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] rbd image 
5db380b6-b80a-4cb1-b65c-571bfe9b4341_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:11:40.905604 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 5db380b6-b80a-4cb1-b65c-571bfe9b4341_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:11:40.943712 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] While synchronizing instance power states, found 3 instances in the database and 2 instances on the hypervisor. Aug 30 14:11:40.944170 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Triggering sync for uuid d3e215ac-b5f4-4d63-a3b7-22c9c3720570 {{(pid=107505) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10252}} Aug 30 14:11:40.944847 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Triggering sync for uuid a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f {{(pid=107505) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10252}} Aug 30 14:11:40.945341 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Triggering sync for uuid 5db380b6-b80a-4cb1-b65c-571bfe9b4341 {{(pid=107505) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10252}} Aug 30 14:11:40.946349 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock 
"d3e215ac-b5f4-4d63-a3b7-22c9c3720570" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:11:40.947012 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:11:40.948956 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:11:40.949469 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:11:40.950058 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "5db380b6-b80a-4cb1-b65c-571bfe9b4341" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:11:41.009582 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.063s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:11:41.011009 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.062s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:11:41.216593 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 5db380b6-b80a-4cb1-b65c-571bfe9b4341_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.311s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:11:41.321473 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] resizing rbd image 5db380b6-b80a-4cb1-b65c-571bfe9b4341_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:11:41.436216 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 
tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:11:41.436707 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Ensure instance console log exists: /opt/stack/data/nova/instances/5db380b6-b80a-4cb1-b65c-571bfe9b4341/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:11:41.437389 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:11:41.437858 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:11:41.438225 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:11:41.588450 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Successfully created port: e431a7d7-312d-451e-939f-4716633bee2c {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:11:42.350885 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Successfully updated port: e431a7d7-312d-451e-939f-4716633bee2c {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:11:42.367016 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Acquiring lock "refresh_cache-5db380b6-b80a-4cb1-b65c-571bfe9b4341" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:11:42.367356 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Acquired lock "refresh_cache-5db380b6-b80a-4cb1-b65c-571bfe9b4341" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:11:42.367784 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 
tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:11:42.515471 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-2a93cad5-7e62-4e18-8aed-0abbd91e5842 req-71b08fe8-8620-432b-bb49-976203ace5f5 service nova] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Received event network-changed-e431a7d7-312d-451e-939f-4716633bee2c {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:11:42.515837 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-2a93cad5-7e62-4e18-8aed-0abbd91e5842 req-71b08fe8-8620-432b-bb49-976203ace5f5 service nova] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Refreshing instance network info cache due to event network-changed-e431a7d7-312d-451e-939f-4716633bee2c. {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:11:42.516084 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-2a93cad5-7e62-4e18-8aed-0abbd91e5842 req-71b08fe8-8620-432b-bb49-976203ace5f5 service nova] Acquiring lock "refresh_cache-5db380b6-b80a-4cb1-b65c-571bfe9b4341" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:11:42.556630 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Instance cache missing network info. 
{{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:11:43.378778 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:43.444859 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Updating instance_info_cache with network_info: [{"id": "e431a7d7-312d-451e-939f-4716633bee2c", "address": "fa:16:3e:65:d8:46", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.152", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape431a7d7-31", "ovs_interfaceid": "e431a7d7-312d-451e-939f-4716633bee2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:11:43.469065 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Releasing lock 
"refresh_cache-5db380b6-b80a-4cb1-b65c-571bfe9b4341" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:11:43.470294 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Instance network_info: |[{"id": "e431a7d7-312d-451e-939f-4716633bee2c", "address": "fa:16:3e:65:d8:46", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.152", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape431a7d7-31", "ovs_interfaceid": "e431a7d7-312d-451e-939f-4716633bee2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:11:43.471139 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-2a93cad5-7e62-4e18-8aed-0abbd91e5842 req-71b08fe8-8620-432b-bb49-976203ace5f5 service nova] Acquired lock "refresh_cache-5db380b6-b80a-4cb1-b65c-571bfe9b4341" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:11:43.471587 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron 
[req-2a93cad5-7e62-4e18-8aed-0abbd91e5842 req-71b08fe8-8620-432b-bb49-976203ace5f5 service nova] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Refreshing network info cache for port e431a7d7-312d-451e-939f-4716633bee2c {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:11:43.475443 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Start _get_guest_xml network_info=[{"id": "e431a7d7-312d-451e-939f-4716633bee2c", "address": "fa:16:3e:65:d8:46", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.152", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape431a7d7-31", "ovs_interfaceid": "e431a7d7-312d-451e-939f-4716633bee2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:11:43.480025 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:11:43.483683 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:11:43.484467 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] CPU controller missing on host. 
{{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:11:43.603389 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:11:43.603731 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] CPU controller found on host. {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:11:43.605196 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:11:43.605815 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:11:43.606149 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:11:43.606397 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:11:43.606734 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:11:43.607075 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:11:43.607480 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None 
req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:11:43.607974 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:11:43.608164 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:11:43.608477 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:11:43.608753 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:11:43.609057 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None 
req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:11:43.623131 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:11:44.274447 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.651s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:11:44.307607 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] rbd image 5db380b6-b80a-4cb1-b65c-571bfe9b4341_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:11:44.311524 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute 
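[editor's note] The CPU topology search logged above (flavor m1.nano with 1 vCPU, limits sockets=65536, cores=65536, threads=65536, no preference) can be reproduced with a small standalone sketch. `possible_topologies` is a hypothetical helper that mirrors the enumeration idea behind `nova.virt.hardware._get_possible_cpu_topologies`; it is not nova's actual code.

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product equals vcpus.

    Simplified stand-in for nova.virt.hardware._get_possible_cpu_topologies:
    nova walks the sockets/cores/threads combinations and keeps those that
    exactly hold the requested vCPU count within the per-dimension limits.
    """
    topos = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    topos.append((s, c, t))
    return topos

# The flavor in the log (vcpus=1) yields exactly one topology, matching
# "Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]".
print(possible_topologies(1))  # [(1, 1, 1)]
```

For larger flavors the same search produces several candidates, which nova then sorts by the preferred topology before picking the first.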
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:11:44.818481 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-2a93cad5-7e62-4e18-8aed-0abbd91e5842 req-71b08fe8-8620-432b-bb49-976203ace5f5 service nova] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Updated VIF entry in instance network info cache for port e431a7d7-312d-451e-939f-4716633bee2c. {{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:11:44.819251 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-2a93cad5-7e62-4e18-8aed-0abbd91e5842 req-71b08fe8-8620-432b-bb49-976203ace5f5 service nova] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Updating instance_info_cache with network_info: [{"id": "e431a7d7-312d-451e-939f-4716633bee2c", "address": "fa:16:3e:65:d8:46", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.152", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape431a7d7-31", "ovs_interfaceid": "e431a7d7-312d-451e-939f-4716633bee2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:11:44.837608 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-2a93cad5-7e62-4e18-8aed-0abbd91e5842 
req-71b08fe8-8620-432b-bb49-976203ace5f5 service nova] Releasing lock "refresh_cache-5db380b6-b80a-4cb1-b65c-571bfe9b4341" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:11:45.049524 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.738s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:11:45.051997 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:11:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersAdminTestJSON-server-671269412',display_name='tempest-DeleteServersAdminTestJSON-server-671269412',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-deleteserversadmintestjson-server-671269412',id=18,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_reques
ts=InstancePCIRequests,power_state=0,progress=0,project_id='4a37d1371f13469f8d99c0897c9e3375',ramdisk_id='',reservation_id='r-g0pi63of',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersAdminTestJSON-895773713',owner_user_name='tempest-DeleteServersAdminTestJSON-895773713-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:11:40Z,user_data=None,user_id='31659ce5f29944eab1ef53566e15c84f',uuid=5db380b6-b80a-4cb1-b65c-571bfe9b4341,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e431a7d7-312d-451e-939f-4716633bee2c", "address": "fa:16:3e:65:d8:46", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.152", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape431a7d7-31", "ovs_interfaceid": "e431a7d7-312d-451e-939f-4716633bee2c", 
"qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:11:45.052451 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Converting VIF {"id": "e431a7d7-312d-451e-939f-4716633bee2c", "address": "fa:16:3e:65:d8:46", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.152", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape431a7d7-31", "ovs_interfaceid": "e431a7d7-312d-451e-939f-4716633bee2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:11:45.053804 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:65:d8:46,bridge_name='br-int',has_traffic_filtering=True,id=e431a7d7-312d-451e-939f-4716633bee2c,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape431a7d7-31') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:11:45.055410 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Lazy-loading 'pci_devices' on Instance uuid 5db380b6-b80a-4cb1-b65c-571bfe9b4341 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:11:45.071665 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] End _get_guest_xml xml= [multi-line libvirt guest domain XML followed here; the XML element tags were stripped from this capture, leaving only element text. Recoverable values: uuid 5db380b6-b80a-4cb1-b65c-571bfe9b4341, name instance-00000012, memory 131072, vcpus 1, nova metadata (instance name tempest-DeleteServersAdminTestJSON-server-671269412, creationTime 2023-08-30 14:11:43, memory 128, vcpus 1, owner user tempest-DeleteServersAdminTestJSON-895773713-project-member, project tempest-DeleteServersAdminTestJSON-895773713), sysinfo OpenStack Foundation / OpenStack Nova / 27.1.0 / serial and uuid 5db380b6-b80a-4cb1-b65c-571bfe9b4341 / family Virtual Machine, os type hvm, cpu model Nehalem, rng backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:11:45.083569 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Preparing to wait for external event network-vif-plugged-e431a7d7-312d-451e-939f-4716633bee2c {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:11:45.083569 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Acquiring lock "5db380b6-b80a-4cb1-b65c-571bfe9b4341-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" {{(pid=107505) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:11:45.083569 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Lock "5db380b6-b80a-4cb1-b65c-571bfe9b4341-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:11:45.083569 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Lock "5db380b6-b80a-4cb1-b65c-571bfe9b4341-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:11:45.083569 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] vif_type=ovs
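[editor's note] The "Preparing to wait for external event network-vif-plugged-..." entry and the surrounding `<uuid>-events` lock traffic implement a register-then-wait handshake: the compute manager records the event it expects before plugging the VIF, and Neutron's later callback releases the waiter. A minimal sketch of that pattern, assuming a simplified `InstanceEvents` class that only approximates nova's:

```python
import threading

class InstanceEvents:
    """Simplified sketch of nova.compute.manager.InstanceEvents.

    The real class keys events per instance under an oslo lock named
    "<instance-uuid>-events" (visible in the log); this stand-in uses a
    plain threading.Lock and threading.Event for illustration only.
    """
    def __init__(self):
        self._lock = threading.Lock()
        self._events = {}

    def prepare(self, instance_uuid, event_name):
        """Register interest in an event; analogous to _create_or_get_event."""
        key = (instance_uuid, event_name)
        with self._lock:
            return self._events.setdefault(key, threading.Event())

    def deliver(self, instance_uuid, event_name):
        """Called when the external service reports the event (e.g. Neutron)."""
        with self._lock:
            ev = self._events.get((instance_uuid, event_name))
        if ev:
            ev.set()

events = InstanceEvents()
waiter = events.prepare("5db380b6-b80a-4cb1-b65c-571bfe9b4341",
                        "network-vif-plugged-e431a7d7-312d-451e-939f-4716633bee2c")
# ... plug the VIF, define the domain ...
events.deliver("5db380b6-b80a-4cb1-b65c-571bfe9b4341",
               "network-vif-plugged-e431a7d7-312d-451e-939f-4716633bee2c")
```

Registering before plugging matters: if the wait were set up after the plug, Neutron could deliver `network-vif-plugged` before anyone was listening and the boot would time out.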
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:11:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersAdminTestJSON-server-671269412',display_name='tempest-DeleteServersAdminTestJSON-server-671269412',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-deleteserversadmintestjson-server-671269412',id=18,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4a37d1371f13469f8d99c0897c9e3375',ramdisk_id='',reservation_id='r-g0pi63of',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersAdminTestJSON-895773713',owner_user_name='tempest-DeleteServersAdminTestJSON-895773713-project-member'},tags=TagList,tas
k_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:11:40Z,user_data=None,user_id='31659ce5f29944eab1ef53566e15c84f',uuid=5db380b6-b80a-4cb1-b65c-571bfe9b4341,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e431a7d7-312d-451e-939f-4716633bee2c", "address": "fa:16:3e:65:d8:46", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.152", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape431a7d7-31", "ovs_interfaceid": "e431a7d7-312d-451e-939f-4716633bee2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:11:45.084182 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Converting VIF {"id": "e431a7d7-312d-451e-939f-4716633bee2c", "address": "fa:16:3e:65:d8:46", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.152", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape431a7d7-31", "ovs_interfaceid": "e431a7d7-312d-451e-939f-4716633bee2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:11:45.084182 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:d8:46,bridge_name='br-int',has_traffic_filtering=True,id=e431a7d7-312d-451e-939f-4716633bee2c,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape431a7d7-31') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:11:45.084182 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:d8:46,bridge_name='br-int',has_traffic_filtering=True,id=e431a7d7-312d-451e-939f-4716633bee2c,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape431a7d7-31') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:11:45.084779 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:45.085595 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:11:45.086566 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:11:45.091047 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:45.091420 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape431a7d7-31, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:11:45.092078 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape431a7d7-31, col_values=(('external_ids', {'iface-id': 'e431a7d7-312d-451e-939f-4716633bee2c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:d8:46', 'vm-uuid': '5db380b6-b80a-4cb1-b65c-571bfe9b4341'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:11:45.093965 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:45.097748 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:11:45.101433 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:45.103128 np0035104604 nova-compute[107505]: INFO os_vif [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:d8:46,bridge_name='br-int',has_traffic_filtering=True,id=e431a7d7-312d-451e-939f-4716633bee2c,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape431a7d7-31') Aug 30 14:11:45.142974 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] No BDM found with device name vda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:11:45.143448 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:11:45.143850 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] No VIF found with MAC fa:16:3e:65:d8:46, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:11:45.144788 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Using config drive Aug 30 14:11:45.171354 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] rbd image 5db380b6-b80a-4cb1-b65c-571bfe9b4341_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:11:45.532844 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Creating config drive at /opt/stack/data/nova/instances/5db380b6-b80a-4cb1-b65c-571bfe9b4341/disk.config Aug 30 14:11:45.537421 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/5db380b6-b80a-4cb1-b65c-571bfe9b4341/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher 
OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpitzd8_t6 {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:11:45.571478 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/5db380b6-b80a-4cb1-b65c-571bfe9b4341/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpitzd8_t6" returned: 0 in 0.034s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:11:45.601594 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] rbd image 5db380b6-b80a-4cb1-b65c-571bfe9b4341_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:11:45.605126 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-4b14e909-ab82-4f29-a763-c2514c3f7355 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/5db380b6-b80a-4cb1-b65c-571bfe9b4341/disk.config 5db380b6-b80a-4cb1-b65c-571bfe9b4341_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:11:50.097222 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:53.748571 
np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:55.101976 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:11:58.749748 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:00.104898 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:03.750737 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:05.106696 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:08.751724 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:10.027345 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:12:10.027892 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:12:10.028605 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:12:10.051665 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:12:10.052698 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:12:10.053393 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:12:10.054099 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:12:10.055109 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:12:10.108358 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:10.625487 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.570s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:12:10.695475 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000011 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:12:10.695976 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000011 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:12:10.699072 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:12:10.699446 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) 
_get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:12:10.702284 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:12:10.702826 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:12:10.765126 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:12:10.766814 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1459MB free_disk=29.919174194335938GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", 
"product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:12:10.767339 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:12:10.767931 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:12:10.959732 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance d3e215ac-b5f4-4d63-a3b7-22c9c3720570 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:12:10.960075 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:12:10.960373 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 5db380b6-b80a-4cb1-b65c-571bfe9b4341 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:12:10.960772 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 3 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:12:10.961107 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=896MB phys_disk=29GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:12:11.566140 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:12:12.205633 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.639s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:12:12.212207 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:12:12.225713 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:12:12.259414 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:12:12.259735 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.492s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:12:15.111767 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:12:15.260083 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:12:15.263605 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:12:15.478256 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "refresh_cache-a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:12:15.478771 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquired lock "refresh_cache-a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:12:15.479246 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Forcefully refreshing network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}} Aug 30 
14:12:16.818450 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Updating instance_info_cache with network_info: [{"id": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "address": "fa:16:3e:24:9e:9b", "network": {"id": "fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.145", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cb26a8d-2c", "ovs_interfaceid": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:12:16.834197 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Releasing lock "refresh_cache-a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:12:16.834828 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Updated the network info_cache for instance {{(pid=107505) 
_heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} Aug 30 14:12:16.835481 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:12:16.835755 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:12:16.836033 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:12:17.026265 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:12:17.026917 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:12:17.027399 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:12:18.755835 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:20.115677 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:23.757404 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:25.119217 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:30.121250 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:12:30.122786 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:12:30.123150 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=107505) run /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:117}} Aug 30 14:12:30.123439 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:12:30.124546 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) 
_transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:12:30.126407 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:12:33.760447 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:35.125945 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:40.127893 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:12:45.129361 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:12:45.130934 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:45.131241 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=107505) run /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:117}} Aug 30 14:12:45.131527 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:12:45.132210 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) 
_transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:12:45.132818 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:50.133948 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:55.135845 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:12:55.137181 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:55.137437 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=107505) run /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:117}} Aug 30 14:12:55.137704 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:12:55.138447 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:12:55.139072 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:12:58.766694 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) 
__log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:00.140054 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:03.768426 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:05.142210 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:08.770521 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:10.027985 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:13:10.063449 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:13:10.063980 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:13:10.064424 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:13:10.064867 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:13:10.065529 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:13:10.145321 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:10.662959 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.597s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:13:10.736221 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000011 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config 
/opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:13:10.736576 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000011 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:13:10.739959 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:13:10.740249 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:13:10.743355 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:13:10.743647 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:13:10.806440 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Aug 30 14:13:10.807863 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1442MB free_disk=29.919174194335938GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:13:10.808118 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:13:10.808495 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:13:11.016646 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance d3e215ac-b5f4-4d63-a3b7-22c9c3720570 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:13:11.017130 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:13:11.017440 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 5db380b6-b80a-4cb1-b65c-571bfe9b4341 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:13:11.017885 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 3 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:13:11.018191 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=896MB phys_disk=29GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:13:11.553720 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:13:12.181646 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.628s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:13:12.187890 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:13:12.204005 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not 
changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:13:12.207880 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:13:12.208219 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.400s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:13:13.207974 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:13:13.208971 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:13:13.208971 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None 
req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:13:13.773098 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:14.026634 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:13:14.026868 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:13:14.027032 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:13:14.047588 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Skipping network cache update for instance because it is Building. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:13:14.199036 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:13:14.199402 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquired lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:13:14.199862 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Forcefully refreshing network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}} Aug 30 14:13:14.200384 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lazy-loading 'info_cache' on Instance uuid d3e215ac-b5f4-4d63-a3b7-22c9c3720570 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:13:15.206988 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:15.478984 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updating instance_info_cache with network_info: [{"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": 
"tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.35", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:13:15.498014 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Releasing lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:13:15.498458 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updated the network info_cache for instance {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} Aug 30 14:13:15.499237 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:13:16.025370 
np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:13:16.025861 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:13:17.025664 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:13:18.774077 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:19.018997 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:13:19.025271 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:13:20.209997 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:25.211500 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:13:25.213193 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:13:25.213474 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=107505) run /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:117}} Aug 30 14:13:25.213677 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:13:25.214321 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:13:25.214964 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:28.775030 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:30.216832 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:33.776148 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:35.219676 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:40.221543 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:45.223283 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:13:45.224498 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:45.224764 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=107505) run /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:117}} Aug 30 14:13:45.224982 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:13:45.225543 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:13:45.226052 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:50.228626 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:13:55.232740 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:14:00.236014 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:14:00.237300 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:00.237664 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=107505) run /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:117}} Aug 30 14:14:00.238045 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:14:00.239024 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:14:00.239801 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:03.780985 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:05.240461 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:08.782160 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:10.026875 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:14:10.048973 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:14:10.049485 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:14:10.049946 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:14:10.050355 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) 
update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:14:10.051021 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:14:10.244583 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:10.589604 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.538s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:14:10.670665 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000011 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:14:10.671195 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000011 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:14:10.675694 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:14:10.676046 np0035104604 nova-compute[107505]: 
DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:14:10.679432 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:14:10.679647 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:14:10.749387 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Aug 30 14:14:10.750643 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1446MB free_disk=29.919174194335938GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:14:10.751038 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:14:10.751264 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:14:10.938080 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance d3e215ac-b5f4-4d63-a3b7-22c9c3720570 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:14:10.938384 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:14:10.938672 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 5db380b6-b80a-4cb1-b65c-571bfe9b4341 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:14:10.939124 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 3 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:14:10.939614 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=896MB phys_disk=29GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:14:11.503702 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:14:12.106035 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.601s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:14:12.114876 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:14:12.134167 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not 
changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:14:12.142842 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:14:12.143309 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.392s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:14:13.143748 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:14:13.144414 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:14:13.784256 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:14.028076 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:14:14.028589 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:14:14.217070 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "refresh_cache-a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:14:14.217471 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquired lock "refresh_cache-a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:14:14.217558 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Forcefully refreshing network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}} Aug 30 14:14:15.249146 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 
{{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:15.474789 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Updating instance_info_cache with network_info: [{"id": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "address": "fa:16:3e:24:9e:9b", "network": {"id": "fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.145", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cb26a8d-2c", "ovs_interfaceid": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:14:15.489591 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Releasing lock "refresh_cache-a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:14:15.490043 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 
a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Updated the network info_cache for instance {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} Aug 30 14:14:15.490635 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:14:17.026176 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:14:18.026798 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:14:18.786039 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:19.017979 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:14:19.025330 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:14:19.025808 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:14:20.252548 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:23.789499 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:25.257012 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:30.258410 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:14:33.791680 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:35.260147 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:38.792423 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:40.262969 np0035104604 nova-compute[107505]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:45.265270 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:14:50.268596 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:14:53.491597 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3569f36e-4b57-4dd6-803c-83e65e151198 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Acquiring lock "5db380b6-b80a-4cb1-b65c-571bfe9b4341" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:14:55.272576 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:14:55.273310 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:14:55.273464 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=107505) run /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:117}} Aug 30 14:14:55.273816 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:14:55.274745 
np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:14:55.276515 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:14:59.605136 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-25a6f74d-80f4-4720-8b81-8859e0d3e46b tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Get console output Aug 30 14:14:59.612814 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon [None req-25a6f74d-80f4-4720-8b81-8859e0d3e46b tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpu7iao6ye/privsep.sock'] Aug 30 14:14:59.636109 np0035104604 sudo[129938]: stack : PWD=/ ; USER=root ; COMMAND=/opt/stack/data/venv/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context nova.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu7iao6ye/privsep.sock Aug 30 14:14:59.637140 np0035104604 sudo[129938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Aug 30 14:15:00.275432 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:15:00.277344 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:00.277521 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=107505) run /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:117}} Aug 30 14:15:00.277676 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:15:00.278314 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:15:00.279911 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:15:01.143760 np0035104604 sudo[129938]: pam_unix(sudo:session): session closed for user root Aug 30 14:15:01.147332 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon [None req-25a6f74d-80f4-4720-8b81-8859e0d3e46b tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Spawned new privsep daemon via rootwrap Aug 30 14:15:01.149578 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon [-] privsep daemon starting Aug 30 14:15:01.150142 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Aug 30 14:15:01.150509 np0035104604 nova-compute[107505]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none Aug 30 14:15:01.150781 np0035104604 nova-compute[107505]: INFO 
oslo.privsep.daemon [-] privsep daemon running as pid 129941 Aug 30 14:15:01.292279 np0035104604 nova-compute[107505]: INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes Aug 30 14:15:03.798566 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:05.280014 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:07.250131 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-86ca6f55-af9e-4fa3-931e-36e7bfcc380c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Get console output Aug 30 14:15:07.255965 np0035104604 nova-compute[107505]: INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes Aug 30 14:15:07.399113 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquiring lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:15:07.400295 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" 
:: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:15:07.400765 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquiring lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:15:07.401171 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:15:07.401740 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:15:07.408506 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Terminating instance Aug 30 14:15:07.413695 np0035104604 nova-compute[107505]: DEBUG 
nova.compute.manager [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Start destroying the instance on the hypervisor. {{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:15:08.800853 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:10.025581 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:15:10.047851 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:15:10.048293 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:15:10.048741 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:15:10.049196 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:15:10.049692 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:15:10.283284 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:10.735893 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.686s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:15:13.805167 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:15.286628 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:17.459754 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:17.477333 
np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:17.496446 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:17.515297 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:17.516163 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:17.668885 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Instance destroyed successfully. 
Aug 30 14:15:17.669692 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lazy-loading 'resources' on Instance uuid a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:15:17.694977 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:10:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-33891592',display_name='tempest-VolumesAdminNegativeTest-server-33891592',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-volumesadminnegativetest-server-33891592',id=17,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFNxfMlTgh2AlbGeKCoNjSLqZkOwAUAo8SvV+lBmCfnE0eFq8n65XVY/dOcrTg6YnZH7zyAoMgPMPwvyChNd5HG2vPgUcx8zoLGgorKdVMcmOEgjCJY+qxWvpP1j+79+yQ==',key_name='tempest-keypair-1240208729',keypairs=,launch_index=0,launched_at=2023-08-30T14:10:53Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='ea4a96dd5eeb416ca49cf98d07944952',ramdisk_id='',reservation_id='r-szs3hgql',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-416011200',owner_user_name='tempest-VolumesAdminNegativeTest-416011200-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-08-30T14:10:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bf110227fdb94f70af671a2ffb9b4c04',uuid=a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "address": "fa:16:3e:24:9e:9b", "network": {"id": 
"fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.145", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cb26a8d-2c", "ovs_interfaceid": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:15:17.695421 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Converting VIF {"id": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "address": "fa:16:3e:24:9e:9b", "network": {"id": "fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.145", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 1372, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0cb26a8d-2c", "ovs_interfaceid": "0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:15:17.697153 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:9e:9b,bridge_name='br-int',has_traffic_filtering=True,id=0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7,network=Network(fc683610-55d4-42da-b1c6-1b0e5b095492),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0cb26a8d-2c') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:15:17.697998 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:9e:9b,bridge_name='br-int',has_traffic_filtering=True,id=0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7,network=Network(fc683610-55d4-42da-b1c6-1b0e5b095492),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0cb26a8d-2c') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:15:17.701869 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:17.702505 np0035104604 nova-compute[107505]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0cb26a8d-2c, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:15:17.705508 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:17.708803 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:15:17.718561 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:17.731816 np0035104604 nova-compute[107505]: INFO os_vif [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:9e:9b,bridge_name='br-int',has_traffic_filtering=True,id=0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7,network=Network(fc683610-55d4-42da-b1c6-1b0e5b095492),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0cb26a8d-2c') Aug 30 14:15:17.768715 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-962aacc4-73c4-4430-8ffc-a004024a9458 req-8d3cc7f4-fcb1-4065-b23f-4de715929fbf service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Received event network-vif-unplugged-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:15:17.769253 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-962aacc4-73c4-4430-8ffc-a004024a9458 
req-8d3cc7f4-fcb1-4065-b23f-4de715929fbf service nova] Acquiring lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:15:17.769799 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-962aacc4-73c4-4430-8ffc-a004024a9458 req-8d3cc7f4-fcb1-4065-b23f-4de715929fbf service nova] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:15:17.770231 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-962aacc4-73c4-4430-8ffc-a004024a9458 req-8d3cc7f4-fcb1-4065-b23f-4de715929fbf service nova] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:15:17.770716 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-962aacc4-73c4-4430-8ffc-a004024a9458 req-8d3cc7f4-fcb1-4065-b23f-4de715929fbf service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] No waiting events found dispatching network-vif-unplugged-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:15:17.771141 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-962aacc4-73c4-4430-8ffc-a004024a9458 req-8d3cc7f4-fcb1-4065-b23f-4de715929fbf service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Received event network-vif-unplugged-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 for instance with task_state deleting. 
{{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Aug 30 14:15:17.802032 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000011 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:15:17.802389 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000011 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:15:17.808163 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:15:17.808512 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:15:17.817873 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:15:17.818137 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:15:17.901211 
np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:15:17.903105 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1582MB free_disk=29.919174194335938GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:15:17.903517 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:15:17.904057 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:15:17.984385 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:18.186403 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance d3e215ac-b5f4-4d63-a3b7-22c9c3720570 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:15:18.189138 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:15:18.189631 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 5db380b6-b80a-4cb1-b65c-571bfe9b4341 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:15:18.190195 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 3 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:15:18.190829 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=896MB phys_disk=29GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:15:18.816424 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:18.839264 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:15:19.422445 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf 
/etc/ceph/ceph.conf" returned: 0 in 0.583s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:15:19.429459 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:15:19.449030 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:15:19.483908 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:15:19.484346 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.580s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:15:19.810410 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-c5380c6f-5abf-4fb4-9f06-d2105a816d69 
req-c4c46935-c185-4c03-bf50-c16b77a848a2 service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Received event network-vif-plugged-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:15:19.811051 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c5380c6f-5abf-4fb4-9f06-d2105a816d69 req-c4c46935-c185-4c03-bf50-c16b77a848a2 service nova] Acquiring lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:15:19.811857 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c5380c6f-5abf-4fb4-9f06-d2105a816d69 req-c4c46935-c185-4c03-bf50-c16b77a848a2 service nova] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:15:19.812452 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c5380c6f-5abf-4fb4-9f06-d2105a816d69 req-c4c46935-c185-4c03-bf50-c16b77a848a2 service nova] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:15:19.812951 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-c5380c6f-5abf-4fb4-9f06-d2105a816d69 req-c4c46935-c185-4c03-bf50-c16b77a848a2 service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] No waiting events found dispatching network-vif-plugged-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 
14:15:19.813715 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-c5380c6f-5abf-4fb4-9f06-d2105a816d69 req-c4c46935-c185-4c03-bf50-c16b77a848a2 service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Received unexpected event network-vif-plugged-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 for instance with vm_state active and task_state deleting. Aug 30 14:15:22.485311 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:15:22.485920 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:15:22.486322 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:15:22.486707 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:15:22.487053 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:15:22.512892 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None 
req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Skipping network cache update for instance because it is being deleted. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9859}} Aug 30 14:15:22.513260 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:15:22.705467 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:22.713579 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:15:22.713829 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquired lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:15:22.714262 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Forcefully refreshing network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}} Aug 30 14:15:22.714673 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lazy-loading 'info_cache' on Instance uuid 
d3e215ac-b5f4-4d63-a3b7-22c9c3720570 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:15:23.908327 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updating instance_info_cache with network_info: [{"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.35", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:15:23.927397 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Releasing lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:15:23.927397 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 
d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updated the network info_cache for instance {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} Aug 30 14:15:23.927397 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:15:23.927397 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:15:23.927397 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:15:23.927397 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:15:23.927397 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:15:23.927397 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task 
ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:15:23.927397 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:15:27.706975 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:27.828926 np0035104604 nova-compute[107505]: WARNING nova.storage.rbd_utils [-] rbd remove a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 14:15:27.829229 np0035104604 nova-compute[107505]: WARNING oslo.service.loopingcall [-] Function 'nova.storage.rbd_utils.RBDDriver._destroy_volume.<locals>._cleanup_vol' run outlasted interval by 5.05 sec Aug 30 14:15:32.665079 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:15:32.665675 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] VM Stopped (Lifecycle Event) Aug 30 14:15:32.687590 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8f54127b-1679-4179-ae15-21175c602a91 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:15:32.692291 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8f54127b-1679-4179-ae15-21175c602a91 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: 
active, current task_state: deleting, current DB power_state: 1, VM power_state: 4 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:15:32.708423 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:15:32.713211 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-8f54127b-1679-4179-ae15-21175c602a91 None None] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] During sync_power_state the instance has a pending task (deleting). Skip. Aug 30 14:15:33.807731 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:37.710604 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:37.867863 np0035104604 nova-compute[107505]: WARNING nova.storage.rbd_utils [-] rbd remove a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 14:15:37.868592 np0035104604 nova-compute[107505]: WARNING oslo.service.loopingcall [-] Function 'nova.storage.rbd_utils.RBDDriver._destroy_volume.<locals>._cleanup_vol' run outlasted interval by 5.04 sec Aug 30 14:15:38.809229 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:42.713151 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:42.823340 np0035104604 nova-compute[107505]: 
WARNING nova.storage.rbd_utils [-] rbd remove a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 14:15:43.200376 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Deleting instance files /opt/stack/data/nova/instances/a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_del Aug 30 14:15:43.201677 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Deletion of /opt/stack/data/nova/instances/a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f_del complete Aug 30 14:15:43.274226 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Took 35.86 seconds to destroy the instance on the hypervisor. Aug 30 14:15:43.275018 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:15:43.275517 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:15:43.276003 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:15:44.326088 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:15:44.347318 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Took 1.07 seconds to deallocate network for instance. 
Aug 30 14:15:44.376230 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-33ad7770-0f36-49ee-b667-f20495acb321 req-e00cfdf2-9081-4ec4-a34f-f72e001ab4a3 service nova] [instance: a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f] Received event network-vif-deleted-0cb26a8d-2cf7-499a-a6fe-7191c7bf3eb7 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:15:44.417963 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:15:44.418442 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:15:45.038223 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:15:45.655112 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] CMD "ceph df 
--format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.617s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:15:45.661401 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:15:45.677771 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:15:45.712904 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.294s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:15:45.869275 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 
tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Deleted allocations for instance a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f Aug 30 14:15:45.968412 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2fa09150-3f38-4405-85a1-96ba618eb4b5 tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "a64ae60a-e61d-4c6e-b98d-4a75c9c0ab1f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 38.568s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:15:47.716151 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:48.699373 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquiring lock "f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:15:48.699905 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:15:48.715719 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None 
req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Starting instance... {{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:15:48.928810 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:15:48.929609 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:15:48.934464 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:15:48.935094 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Claim successful on node np0035104604 Aug 30 14:15:49.644853 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:15:50.368421 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.723s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:15:50.375107 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:15:50.390710 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Inventory has not changed for provider 
600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:15:50.427525 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.498s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:15:50.428270 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:15:50.512470 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:15:50.512789 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:15:50.638019 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:15:50.661903 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Start building block device mappings for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:15:50.774200 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:15:50.775186 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:15:50.775668 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Creating image(s) Aug 30 14:15:50.803902 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] rbd image f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:15:50.830887 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] rbd image f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:15:50.861953 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] rbd image f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:15:50.866431 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.processutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:15:50.892983 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bf110227fdb94f70af671a2ffb9b4c04', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ea4a96dd5eeb416ca49cf98d07944952', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:15:51.017470 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.151s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:15:51.018544 np0035104604 nova-compute[107505]: 
DEBUG oslo_concurrency.lockutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:15:51.019456 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:15:51.019930 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:15:51.047534 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] rbd image f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:15:51.052073 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 
tempest-VolumesAdminNegativeTest-416011200-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:15:51.713631 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Successfully created port: 7b66f848-162a-44d1-90c0-ca1d8a73f3d5 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:15:52.559149 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Successfully updated port: 7b66f848-162a-44d1-90c0-ca1d8a73f3d5 {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:15:52.580849 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquiring lock "refresh_cache-f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:15:52.580849 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquired lock "refresh_cache-f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75" {{(pid=107505) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:15:52.581263 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:15:52.759912 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-e240ded9-ce42-4fe3-a68e-ee7b6989ed6b req-84d02701-be3d-4b31-bc08-20684f7adfdb service nova] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Received event network-changed-7b66f848-162a-44d1-90c0-ca1d8a73f3d5 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:15:52.760731 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-e240ded9-ce42-4fe3-a68e-ee7b6989ed6b req-84d02701-be3d-4b31-bc08-20684f7adfdb service nova] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Refreshing instance network info cache due to event network-changed-7b66f848-162a-44d1-90c0-ca1d8a73f3d5. 
{{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:15:52.760892 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-e240ded9-ce42-4fe3-a68e-ee7b6989ed6b req-84d02701-be3d-4b31-bc08-20684f7adfdb service nova] Acquiring lock "refresh_cache-f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:15:52.766092 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:15:52.815078 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:15:53.705805 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Updating instance_info_cache with network_info: [{"id": "7b66f848-162a-44d1-90c0-ca1d8a73f3d5", "address": "fa:16:3e:bc:f5:e7", "network": {"id": "fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 
1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b66f848-16", "ovs_interfaceid": "7b66f848-162a-44d1-90c0-ca1d8a73f3d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:15:53.727756 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Releasing lock "refresh_cache-f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:15:53.728110 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b52bc81-a01c-47a4-b93a-1080c6bd232c tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Instance network_info: |[{"id": "7b66f848-162a-44d1-90c0-ca1d8a73f3d5", "address": "fa:16:3e:bc:f5:e7", "network": {"id": "fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap7b66f848-16", "ovs_interfaceid": "7b66f848-162a-44d1-90c0-ca1d8a73f3d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:15:53.729119 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-e240ded9-ce42-4fe3-a68e-ee7b6989ed6b req-84d02701-be3d-4b31-bc08-20684f7adfdb service nova] Acquired lock "refresh_cache-f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:15:53.729394 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-e240ded9-ce42-4fe3-a68e-ee7b6989ed6b req-84d02701-be3d-4b31-bc08-20684f7adfdb service nova] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Refreshing network info cache for port 7b66f848-162a-44d1-90c0-ca1d8a73f3d5 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:15:54.995592 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-e240ded9-ce42-4fe3-a68e-ee7b6989ed6b req-84d02701-be3d-4b31-bc08-20684f7adfdb service nova] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Updated VIF entry in instance network info cache for port 7b66f848-162a-44d1-90c0-ca1d8a73f3d5. 
{{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:15:54.997036 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-e240ded9-ce42-4fe3-a68e-ee7b6989ed6b req-84d02701-be3d-4b31-bc08-20684f7adfdb service nova] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Updating instance_info_cache with network_info: [{"id": "7b66f848-162a-44d1-90c0-ca1d8a73f3d5", "address": "fa:16:3e:bc:f5:e7", "network": {"id": "fc683610-55d4-42da-b1c6-1b0e5b095492", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1217128506-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "ea4a96dd5eeb416ca49cf98d07944952", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b66f848-16", "ovs_interfaceid": "7b66f848-162a-44d1-90c0-ca1d8a73f3d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:15:55.022972 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-e240ded9-ce42-4fe3-a68e-ee7b6989ed6b req-84d02701-be3d-4b31-bc08-20684f7adfdb service nova] Releasing lock "refresh_cache-f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:15:57.768052 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:16:02.769172 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:16:07.770021 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:16:08.816163 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:16:11.026302 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:16:11.053774 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:16:11.054420 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:16:11.055066 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:16:11.055648 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:16:11.056470 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:16:11.728494 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.671s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:16:11.813717 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:16:11.814194 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:16:11.820002 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None 
None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:16:11.820667 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:16:11.886503 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:16:11.888128 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1594MB free_disk=29.94009780883789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:16:11.888754 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:16:11.889218 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:16:12.104666 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance d3e215ac-b5f4-4d63-a3b7-22c9c3720570 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:16:12.105118 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 5db380b6-b80a-4cb1-b65c-571bfe9b4341 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:16:12.105118 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:16:12.105429 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 3 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:16:12.105684 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=896MB phys_disk=29GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:16:12.721004 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:16:12.772648 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:16:13.375120 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.654s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:16:13.383174 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:16:13.405577 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:16:13.458931 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 
14:16:13.459529 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.570s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:16:13.817170 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:16:14.026575 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:16:15.036304 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:16:15.036998 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:16:15.036998 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:16:15.066814 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] 
Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:16:15.067209 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:16:15.251046 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:16:15.251628 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquired lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:16:15.252134 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Forcefully refreshing network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}} Aug 30 14:16:15.252637 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lazy-loading 'info_cache' on Instance uuid d3e215ac-b5f4-4d63-a3b7-22c9c3720570 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:16:16.729017 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updating instance_info_cache 
with network_info: [{"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.35", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:16:16.743030 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Releasing lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:16:16.743594 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updated the network info_cache for instance {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} Aug 30 14:16:16.744462 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic 
task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:16:16.744889 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:16:17.025554 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:16:17.025889 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Cleaning up deleted instances {{(pid=107505) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11129}} Aug 30 14:16:17.044399 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] There are 0 instances to clean {{(pid=107505) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11138}} Aug 30 14:16:17.775156 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:16:18.044365 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:16:19.024930 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task 
ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:16:19.025770 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:16:19.026292 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:16:19.026692 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Cleaning up deleted instances with incomplete migration {{(pid=107505) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11167}} Aug 30 14:16:21.036890 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:16:21.038643 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:16:22.026302 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage 
{{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:16:22.775937 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:16:23.818764 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:16:27.776913 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:16:32.778698 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:16:37.781124 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:16:38.820122 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:16:42.782914 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:16:47.784057 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:16:52.785233 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:16:57.787796 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:02.789443 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:07.790765 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:17:07.791850 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:07.792235 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=107505) run /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:117}} Aug 30 14:17:07.792569 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:17:07.793434 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:17:07.795388 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:17:08.826074 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:11.026521 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:17:11.051351 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:17:11.051692 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:17:11.052001 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:17:11.052320 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:17:11.052738 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:17:11.664389 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.610s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:17:11.740283 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:17:11.740560 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:17:11.744621 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:17:11.744851 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:17:11.805230 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None 
None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:17:11.806441 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1587MB free_disk=29.94009780883789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view 
/opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:17:11.806710 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:17:11.807038 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:17:12.099854 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance d3e215ac-b5f4-4d63-a3b7-22c9c3720570 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:17:12.100251 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 5db380b6-b80a-4cb1-b65c-571bfe9b4341 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:17:12.100592 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:17:12.101112 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 3 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:17:12.101554 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=896MB phys_disk=29GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:17:12.249631 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Refreshing inventories for resource provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Aug 30 14:17:12.384815 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Updating ProviderTree inventory for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 
'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Aug 30 14:17:12.385575 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Updating inventory in ProviderTree for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Aug 30 14:17:12.512156 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Refreshing aggregate associations for resource provider 600ab55f-530c-4be6-bf02-067d68ce7ee4, aggregates: None {{(pid=107505) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Aug 30 14:17:12.660593 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Refreshing trait associations for resource provider 600ab55f-530c-4be6-bf02-067d68ce7ee4, traits: 
COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_SMMUV3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI {{(pid=107505) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Aug 30 14:17:12.794393 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:13.214887 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:17:13.828317 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 
{{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:13.832858 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.618s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:17:13.838266 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:17:13.852540 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:17:13.857205 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:17:13.857654 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.051s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:17:17.796149 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:17.857561 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:17:17.858322 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:17:17.859009 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:17:17.883061 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:17:17.883648 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Skipping network cache update for instance because it is Building. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:17:18.066674 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:17:18.067085 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquired lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:17:18.067513 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Forcefully refreshing network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}} Aug 30 14:17:18.068164 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lazy-loading 'info_cache' on Instance uuid d3e215ac-b5f4-4d63-a3b7-22c9c3720570 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:17:18.934086 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:19.382756 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updating instance_info_cache with network_info: [{"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": 
"tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.35", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:17:19.400254 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Releasing lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:17:19.400595 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updated the network info_cache for instance {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} Aug 30 14:17:19.401258 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:17:19.401561 
np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:17:19.401868 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:17:19.402140 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:17:20.026910 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:17:20.027748 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:17:21.019029 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:17:22.025511 np0035104604 nova-compute[107505]: DEBUG 
oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:17:22.026118 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:17:22.799126 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:23.829819 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:27.801820 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:28.831910 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:32.803788 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:37.804946 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:42.806176 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 
{{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:43.835335 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:47.807939 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:52.809279 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:17:57.811037 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:02.813621 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:07.814378 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:11.027026 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:18:11.054650 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:18:11.055185 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:18:11.055596 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:18:11.056160 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:18:11.056858 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:18:11.415344 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Acquiring lock "977b8e8e-7768-4786-9a5a-78006565b459" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:18:11.415948 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Lock "977b8e8e-7768-4786-9a5a-78006565b459" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:18:11.432134 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Starting instance... {{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:18:11.702634 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:18:11.703373 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:18:11.709869 np0035104604 nova-compute[107505]: DEBUG 
nova.virt.hardware [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:18:11.710343 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Claim successful on node np0035104604 Aug 30 14:18:11.930374 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.873s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:18:12.443281 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:18:12.444060 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:18:12.451174 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:18:12.451579 np0035104604 
nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:18:12.649265 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:18:12.651396 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1579MB free_disk=29.94009780883789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:18:12.651844 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:18:12.688274 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:18:12.817403 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:13.410297 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.722s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:18:13.416257 np0035104604 
nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:18:13.432206 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:18:13.464084 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.761s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:18:13.464921 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Start building networks asynchronously for instance. 
{{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:18:13.469422 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.818s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:18:13.673399 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Allocating IP information in the background. {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:18:13.673679 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:18:13.791489 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:18:13.799092 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance d3e215ac-b5f4-4d63-a3b7-22c9c3720570 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:18:13.799329 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 5db380b6-b80a-4cb1-b65c-571bfe9b4341 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:18:13.799557 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:18:13.799755 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 977b8e8e-7768-4786-9a5a-78006565b459 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:18:13.800087 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 4 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:18:13.800401 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=1024MB phys_disk=29GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:18:13.911087 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:13.912896 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Start building block device mappings for instance. 
{{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:18:14.167014 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31659ce5f29944eab1ef53566e15c84f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4a37d1371f13469f8d99c0897c9e3375', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:18:14.170132 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:18:14.171168 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:18:14.171740 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Creating image(s) Aug 30 14:18:14.206624 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] rbd image 977b8e8e-7768-4786-9a5a-78006565b459_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:18:14.243650 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] rbd image 977b8e8e-7768-4786-9a5a-78006565b459_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:18:14.276139 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] rbd image 977b8e8e-7768-4786-9a5a-78006565b459_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:18:14.280903 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.processutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:18:14.460198 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.179s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:18:14.461262 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:18:14.462518 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by 
"nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:18:14.463124 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:18:14.498025 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] rbd image 977b8e8e-7768-4786-9a5a-78006565b459_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:18:14.502263 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 977b8e8e-7768-4786-9a5a-78006565b459_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:18:15.019739 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:15.047544 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None 
None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:18:15.104296 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Successfully created port: dd8b6de4-9794-464f-8e57-2d0d6019b1e0 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:18:15.817976 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.771s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:18:15.828552 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:18:15.847984 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:18:15.854304 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:18:15.856664 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.387s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:18:16.275756 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Successfully updated port: dd8b6de4-9794-464f-8e57-2d0d6019b1e0 {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:18:16.290643 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Acquiring lock "refresh_cache-977b8e8e-7768-4786-9a5a-78006565b459" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:18:16.291338 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Acquired lock "refresh_cache-977b8e8e-7768-4786-9a5a-78006565b459" {{(pid=107505) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:18:16.291815 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:18:16.435090 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-100b339d-dd7e-4722-87ad-08b1176ba659 req-f8ab9b17-9bcb-4135-98d5-2b7a4da9c5f1 service nova] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Received event network-changed-dd8b6de4-9794-464f-8e57-2d0d6019b1e0 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:18:16.435469 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-100b339d-dd7e-4722-87ad-08b1176ba659 req-f8ab9b17-9bcb-4135-98d5-2b7a4da9c5f1 service nova] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Refreshing instance network info cache due to event network-changed-dd8b6de4-9794-464f-8e57-2d0d6019b1e0. {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:18:16.435921 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-100b339d-dd7e-4722-87ad-08b1176ba659 req-f8ab9b17-9bcb-4135-98d5-2b7a4da9c5f1 service nova] Acquiring lock "refresh_cache-977b8e8e-7768-4786-9a5a-78006565b459" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:18:16.485579 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Instance cache missing network info. 
{{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:18:17.374445 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Updating instance_info_cache with network_info: [{"id": "dd8b6de4-9794-464f-8e57-2d0d6019b1e0", "address": "fa:16:3e:6f:9c:97", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.99", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd8b6de4-97", "ovs_interfaceid": "dd8b6de4-9794-464f-8e57-2d0d6019b1e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:18:17.393085 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Releasing lock "refresh_cache-977b8e8e-7768-4786-9a5a-78006565b459" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:18:17.393857 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager 
[None req-35eb1f95-a2a9-4ce0-85a6-c375f7ad8020 tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Instance network_info: |[{"id": "dd8b6de4-9794-464f-8e57-2d0d6019b1e0", "address": "fa:16:3e:6f:9c:97", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.99", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd8b6de4-97", "ovs_interfaceid": "dd8b6de4-9794-464f-8e57-2d0d6019b1e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:18:17.394463 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-100b339d-dd7e-4722-87ad-08b1176ba659 req-f8ab9b17-9bcb-4135-98d5-2b7a4da9c5f1 service nova] Acquired lock "refresh_cache-977b8e8e-7768-4786-9a5a-78006565b459" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:18:17.394876 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-100b339d-dd7e-4722-87ad-08b1176ba659 req-f8ab9b17-9bcb-4135-98d5-2b7a4da9c5f1 service nova] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Refreshing network info cache for port dd8b6de4-9794-464f-8e57-2d0d6019b1e0 {{(pid=107505) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:18:17.921339 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:18.673475 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-100b339d-dd7e-4722-87ad-08b1176ba659 req-f8ab9b17-9bcb-4135-98d5-2b7a4da9c5f1 service nova] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Updated VIF entry in instance network info cache for port dd8b6de4-9794-464f-8e57-2d0d6019b1e0. {{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:18:18.674305 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-100b339d-dd7e-4722-87ad-08b1176ba659 req-f8ab9b17-9bcb-4135-98d5-2b7a4da9c5f1 service nova] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Updating instance_info_cache with network_info: [{"id": "dd8b6de4-9794-464f-8e57-2d0d6019b1e0", "address": "fa:16:3e:6f:9c:97", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.99", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd8b6de4-97", "ovs_interfaceid": "dd8b6de4-9794-464f-8e57-2d0d6019b1e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:18:18.687663 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-100b339d-dd7e-4722-87ad-08b1176ba659 req-f8ab9b17-9bcb-4135-98d5-2b7a4da9c5f1 service nova] Releasing lock "refresh_cache-977b8e8e-7768-4786-9a5a-78006565b459" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:18:18.842737 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:19.857419 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:18:19.858190 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:18:19.858435 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:18:19.885340 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Skipping network cache update for instance because it is Building. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:18:19.885675 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:18:19.885917 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:18:20.044762 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:18:20.044992 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquired lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:18:20.045396 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Forcefully refreshing network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}} Aug 30 14:18:20.045783 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lazy-loading 'info_cache' on Instance uuid d3e215ac-b5f4-4d63-a3b7-22c9c3720570 {{(pid=107505) 
obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:18:21.354401 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updating instance_info_cache with network_info: [{"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.35", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:18:21.375205 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Releasing lock "refresh_cache-d3e215ac-b5f4-4d63-a3b7-22c9c3720570" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:18:21.375492 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updated the 
network info_cache for instance {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} Aug 30 14:18:21.375827 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:18:21.376528 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:18:21.376712 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:18:21.377491 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:18:21.377624 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:18:22.026173 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:18:22.026635 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:18:22.027083 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:18:22.923697 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:23.844581 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:27.925460 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:28.845916 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:32.927135 np0035104604 nova-compute[107505]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:37.928772 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:42.930543 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:47.933835 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:18:48.852417 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:52.935993 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:53.853240 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:18:57.938042 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:02.940053 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:05.713456 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-5a7d61e8-40a3-4879-b8ed-6990d61035d2 
tempest-VolumesAdminNegativeTest-416011200 tempest-VolumesAdminNegativeTest-416011200-project-member] Acquiring lock "f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:19:07.941255 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:12.025338 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:19:12.051079 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:19:12.051606 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:19:12.052001 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:19:12.052395 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:19:12.052895 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:19:12.644836 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.591s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:19:12.718526 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:19:12.718921 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:19:12.724522 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) 
_get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:19:12.724892 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:19:12.807033 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:19:12.809032 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1547MB free_disk=29.920181274414062GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:19:12.809620 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:19:12.810200 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:19:13.003145 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:13.040436 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance d3e215ac-b5f4-4d63-a3b7-22c9c3720570 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:19:13.040992 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 5db380b6-b80a-4cb1-b65c-571bfe9b4341 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:19:13.041281 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:19:13.041693 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 977b8e8e-7768-4786-9a5a-78006565b459 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:19:13.042097 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 4 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:19:13.042321 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=1024MB phys_disk=29GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:19:13.462430 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-b34e2f5e-f74c-4184-a2a1-52d9766560f2 tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Get console output Aug 30 14:19:13.471142 np0035104604 nova-compute[107505]: INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes Aug 30 14:19:13.770753 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Acquiring lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:19:13.771064 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lock 
"d3e215ac-b5f4-4d63-a3b7-22c9c3720570" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:19:13.771503 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Acquiring lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:19:13.771887 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:19:13.772325 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:19:13.779097 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: 
d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Terminating instance Aug 30 14:19:13.781753 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Start destroying the instance on the hypervisor. {{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:19:13.797310 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:19:18.007124 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:19:23.011223 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:19:23.822701 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:23.833240 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:23.850563 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:23.881322 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 
{{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:23.899425 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:24.025983 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Instance destroyed successfully. Aug 30 14:19:24.026922 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lazy-loading 'resources' on Instance uuid d3e215ac-b5f4-4d63-a3b7-22c9c3720570 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:19:24.045577 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:10:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1906138070',display_name='tempest-ServersTestManualDisk-server-1906138070',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-serverstestmanualdisk-server-1906138070',id=16,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPvXTXjVdhbUpv92S2TvfzMPoYC1l9OHahiBNeQMj3dh0MCCgAqnLpeVXL58D1mC9IzKrvbUKlqfO/pFjAzxhlD0s9T6CISLPK22eqeT5xx1Ul8L+CUPcem/Mk0/tnDjiA==',key_name='tempest-keypair-230544950',keypairs=,launch_index=0,launched_at=2023-08-30T14:10:46Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='13d6c819bc1543bbb5481c22adf548e8',ramdisk_id='',reservation_id='r-57raj7c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersTestManualDisk-57483657',owner_user_name='tempest-ServersTestManualDisk-57483657-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-08-30T14:10:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='99da94c3250e4df08bef191e04588171',uuid=d3e215ac-b5f4-4d63-a3b7-22c9c3720570,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": 
"9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.35", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:19:24.046275 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Converting VIF {"id": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "address": "fa:16:3e:26:70:26", "network": {"id": "9a6c7309-6d00-4e31-ab3a-4b23c80f5c66", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1690993988-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.5.35", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "13d6c819bc1543bbb5481c22adf548e8", "mtu": 1372, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap09224fee-8a", "ovs_interfaceid": "09224fee-8a35-4c18-b3c4-8c55dc653c62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:19:24.048237 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:70:26,bridge_name='br-int',has_traffic_filtering=True,id=09224fee-8a35-4c18-b3c4-8c55dc653c62,network=Network(9a6c7309-6d00-4e31-ab3a-4b23c80f5c66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09224fee-8a') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:19:24.048957 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:70:26,bridge_name='br-int',has_traffic_filtering=True,id=09224fee-8a35-4c18-b3c4-8c55dc653c62,network=Network(9a6c7309-6d00-4e31-ab3a-4b23c80f5c66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09224fee-8a') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:19:24.052649 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:24.053215 np0035104604 nova-compute[107505]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09224fee-8a, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:19:24.055133 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:24.059866 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:19:24.061497 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:24.067842 np0035104604 nova-compute[107505]: INFO os_vif [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:70:26,bridge_name='br-int',has_traffic_filtering=True,id=09224fee-8a35-4c18-b3c4-8c55dc653c62,network=Network(9a6c7309-6d00-4e31-ab3a-4b23c80f5c66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09224fee-8a') Aug 30 14:19:37.828818 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:37.832235 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:37.837369 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout 
{{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:19:37.837369 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering BACKOFF {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:19:37.847077 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 24.050s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:19:37.860523 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:19:37.877048 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:19:37.958674 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:19:37.959142 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 25.149s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:19:38.244516 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-dccb33f4-bc61-4e18-be7f-6c321fccf600 req-796c95be-e509-4596-b95d-54ebe9de1828 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Received event network-vif-unplugged-09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:19:38.245078 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-dccb33f4-bc61-4e18-be7f-6c321fccf600 req-796c95be-e509-4596-b95d-54ebe9de1828 service nova] Acquiring lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:19:38.245698 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-dccb33f4-bc61-4e18-be7f-6c321fccf600 req-796c95be-e509-4596-b95d-54ebe9de1828 service nova] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:19:38.246121 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-dccb33f4-bc61-4e18-be7f-6c321fccf600 req-796c95be-e509-4596-b95d-54ebe9de1828 service nova] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:19:38.246558 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-dccb33f4-bc61-4e18-be7f-6c321fccf600 req-796c95be-e509-4596-b95d-54ebe9de1828 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] No waiting events found dispatching network-vif-unplugged-09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:19:38.246990 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-dccb33f4-bc61-4e18-be7f-6c321fccf600 req-796c95be-e509-4596-b95d-54ebe9de1828 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Received event network-vif-unplugged-09224fee-8a35-4c18-b3c4-8c55dc653c62 for instance with task_state deleting. {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Aug 30 14:19:38.835705 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:19:38.837556 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering CONNECTING {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:19:38.837995 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLOUT] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:38.838252 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:19:38.839794 np0035104604 nova-compute[107505]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:38.844585 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:38.886474 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:39.024575 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:19:39.024892 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] VM Stopped (Lifecycle Event) Aug 30 14:19:39.047760 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-12aee860-40b7-4147-b965-3fd7d6b77ca2 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:19:39.053608 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-12aee860-40b7-4147-b965-3fd7d6b77ca2 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: deleting, current DB power_state: 1, VM power_state: 4 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:19:39.078281 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-12aee860-40b7-4147-b965-3fd7d6b77ca2 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] During sync_power_state the instance has a pending task (deleting). Skip. 
Aug 30 14:19:40.292589 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-df70c799-b578-476a-8c0a-96d3aee84158 req-98e19ec5-555e-4ae9-8529-47a5e23fd6a2 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Received event network-vif-plugged-09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:19:40.292589 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-df70c799-b578-476a-8c0a-96d3aee84158 req-98e19ec5-555e-4ae9-8529-47a5e23fd6a2 service nova] Acquiring lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:19:40.293142 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-df70c799-b578-476a-8c0a-96d3aee84158 req-98e19ec5-555e-4ae9-8529-47a5e23fd6a2 service nova] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:19:40.293142 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-df70c799-b578-476a-8c0a-96d3aee84158 req-98e19ec5-555e-4ae9-8529-47a5e23fd6a2 service nova] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:19:40.293426 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-df70c799-b578-476a-8c0a-96d3aee84158 req-98e19ec5-555e-4ae9-8529-47a5e23fd6a2 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] No waiting events found dispatching 
network-vif-plugged-09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:19:40.293759 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-df70c799-b578-476a-8c0a-96d3aee84158 req-98e19ec5-555e-4ae9-8529-47a5e23fd6a2 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Received unexpected event network-vif-plugged-09224fee-8a35-4c18-b3c4-8c55dc653c62 for instance with vm_state active and task_state deleting. Aug 30 14:19:41.960678 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:19:41.961292 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:19:41.961829 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:19:41.962293 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:19:41.962611 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) 
_heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:19:41.982939 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Skipping network cache update for instance because it is being deleted. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9859}} Aug 30 14:19:41.983274 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:19:41.983615 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:19:41.983940 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:19:41.984292 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Didn't find any instances for network info cache update. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9928}} Aug 30 14:19:41.984988 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:19:41.985340 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:19:41.985703 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:19:41.986086 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:19:41.986493 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:19:41.986902 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:19:41.987211 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:19:43.886776 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:48.889457 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:19:48.890339 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:19:48.890798 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=107505) run /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:117}} Aug 30 14:19:48.891190 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:19:48.891955 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:19:48.892673 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:19:50.076419 np0035104604 nova-compute[107505]: WARNING 
nova.storage.rbd_utils [-] rbd remove d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 14:19:50.077075 np0035104604 nova-compute[107505]: WARNING oslo.service.loopingcall [-] Function 'nova.storage.rbd_utils.RBDDriver._destroy_volume.<locals>._cleanup_vol' run outlasted interval by 7.24 sec Aug 30 14:19:50.111122 np0035104604 nova-compute[107505]: WARNING nova.storage.rbd_utils [-] rbd remove d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 14:19:53.894372 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:19:55.114452 np0035104604 nova-compute[107505]: WARNING nova.storage.rbd_utils [-] rbd remove d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 14:19:58.895824 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:19:58.897533 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:19:58.897825 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=107505) run /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:117}} Aug 30 14:19:58.898083 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:19:58.898553 np0035104604 
nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:19:58.899217 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:20:00.119327 np0035104604 nova-compute[107505]: WARNING nova.storage.rbd_utils [-] rbd remove d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 14:20:03.899967 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:20:05.122707 np0035104604 nova-compute[107505]: WARNING nova.storage.rbd_utils [-] rbd remove d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 14:20:08.901714 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:20:08.903489 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:20:08.903775 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=107505) run /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:117}} Aug 30 14:20:08.904021 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 
30 14:20:08.904747 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:20:08.905364 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:20:10.126377 np0035104604 nova-compute[107505]: WARNING nova.storage.rbd_utils [-] rbd remove d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 14:20:13.907029 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:20:14.025895 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:20:14.052973 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:20:14.053891 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:20:14.054434 
np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:20:14.054972 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:20:14.055644 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:20:14.776646 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.720s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:20:14.853427 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:20:14.853699 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config 
/opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:20:14.858346 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:20:14.858666 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000010 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:20:14.918541 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:20:14.919806 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1796MB free_disk=29.920181274414062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": 
"1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:20:14.920073 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:20:14.920459 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:20:15.157780 np0035104604 nova-compute[107505]: WARNING nova.storage.rbd_utils [-] rbd remove d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 14:20:15.171407 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None 
req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance d3e215ac-b5f4-4d63-a3b7-22c9c3720570 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:20:15.171796 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 5db380b6-b80a-4cb1-b65c-571bfe9b4341 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:20:15.172047 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:20:15.172349 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 977b8e8e-7768-4786-9a5a-78006565b459 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:20:15.172682 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 4 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:20:15.172954 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=1024MB phys_disk=29GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:20:15.850576 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:20:16.452457 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.601s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:20:16.460969 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:20:16.480767 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not 
changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:20:16.519364 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:20:16.519756 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.599s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:20:18.910136 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:20:20.161234 np0035104604 nova-compute[107505]: WARNING nova.storage.rbd_utils [-] rbd remove d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 14:20:20.521027 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 
30 14:20:20.521974 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:20:20.522670 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:20:20.552949 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Skipping network cache update for instance because it is being deleted. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9859}} Aug 30 14:20:20.553778 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:20:20.554480 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:20:20.555289 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Skipping network cache update for instance because it is Building. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:20:20.555982 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Didn't find any instances for network info cache update. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9928}} Aug 30 14:20:20.557101 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:20:22.025266 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:20:22.026138 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:20:22.026554 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:20:22.027069 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:20:23.029267 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:20:23.029267 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:20:23.913363 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:20:24.026069 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:20:25.155749 np0035104604 nova-compute[107505]: WARNING nova.storage.rbd_utils [-] rbd remove d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 14:20:28.914555 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:20:28.916944 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:20:28.918059 np0035104604 nova-compute[107505]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=107505) run /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:117}} Aug 30 14:20:28.918799 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:20:28.920377 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:20:28.922820 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:20:30.173718 np0035104604 nova-compute[107505]: WARNING nova.storage.rbd_utils [-] rbd remove d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 14:20:33.921491 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:20:35.169842 np0035104604 nova-compute[107505]: WARNING nova.storage.rbd_utils [-] rbd remove d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 14:20:38.924842 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:20:40.167211 np0035104604 nova-compute[107505]: WARNING nova.storage.rbd_utils [-] rbd remove d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 
14:20:40.190759 np0035104604 nova-compute[107505]: WARNING nova.storage.rbd_utils [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] rbd remove d3e215ac-b5f4-4d63-a3b7-22c9c3720570_disk in pool vms failed: rbd.ImageBusy: [errno 16] RBD image is busy (error removing image) Aug 30 14:20:40.321460 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Deleting instance files /opt/stack/data/nova/instances/d3e215ac-b5f4-4d63-a3b7-22c9c3720570_del Aug 30 14:20:40.322903 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Deletion of /opt/stack/data/nova/instances/d3e215ac-b5f4-4d63-a3b7-22c9c3720570_del complete Aug 30 14:20:40.384340 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Took 86.60 seconds to destroy the instance on the hypervisor. Aug 30 14:20:40.385087 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:20:40.385567 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:20:40.385829 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:20:41.331330 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:20:41.360126 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:20:41.383244 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Took 1.00 seconds to deallocate network for instance. 
Aug 30 14:20:41.401247 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-8fd494d9-57bb-416c-90d4-7fc2c350c537 req-7bca4b7c-c094-41f9-aebc-bcdb86b16cc7 service nova] [instance: d3e215ac-b5f4-4d63-a3b7-22c9c3720570] Received event network-vif-deleted-09224fee-8a35-4c18-b3c4-8c55dc653c62 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:20:41.450982 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:20:41.451804 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:20:42.216699 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:20:42.845409 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] CMD "ceph df --format=json --id cinder --conf 
/etc/ceph/ceph.conf" returned: 0 in 0.627s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:20:42.857582 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:20:42.873896 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:20:42.913846 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.462s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:20:43.080427 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 
tempest-ServersTestManualDisk-57483657-project-member] Deleted allocations for instance d3e215ac-b5f4-4d63-a3b7-22c9c3720570 Aug 30 14:20:43.158817 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ed2af711-3f3a-44f8-88c5-2e879f829e3c tempest-ServersTestManualDisk-57483657 tempest-ServersTestManualDisk-57483657-project-member] Lock "d3e215ac-b5f4-4d63-a3b7-22c9c3720570" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 89.388s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:20:43.926047 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:20:47.791659 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:20:48.928459 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:20:52.848275 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:20:53.877235 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:20:53.929709 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:20:55.251320 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:20:58.459979 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquiring lock "6d52223d-efe0-461c-b964-611dc2f3103e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:20:58.461082 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "6d52223d-efe0-461c-b964-611dc2f3103e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:20:58.482214 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Starting instance... 
{{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:20:58.519754 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquiring lock "d08ac2a0-d12d-46e9-bf89-490056df3cb5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:20:58.520102 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "d08ac2a0-d12d-46e9-bf89-490056df3cb5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:20:58.538870 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Starting instance... 
{{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:20:58.833464 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:20:58.834041 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:20:58.843517 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:20:58.844132 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Claim successful on node np0035104604 Aug 30 14:20:58.854314 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:20:58.879176 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:20:58.930982 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:03.880697 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:03.933127 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:08.882618 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:08.934593 np0035104604 nova-compute[107505]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:13.936826 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:21:16.026334 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:21:16.053897 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:21:18.938630 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:23.651502 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:21:23.885902 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:23.939957 np0035104604 nova-compute[107505]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:24.421540 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.770s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:21:24.431843 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:21:24.452520 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:21:24.486448 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 
tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 25.652s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:21:24.487487 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:21:24.493164 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 25.639s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:21:24.501890 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:21:24.501890 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Claim successful on node np0035104604 Aug 30 14:21:24.591101 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Allocating IP information in the background. {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:21:24.591406 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:21:24.724711 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:21:24.848018 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Start building block device mappings for instance. 
{{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:21:24.876990 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fcbf43c19164442abf5d6e6303211228', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fb6a45f33bd94db2adee5047c736ad16', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:21:25.311106 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:21:25.312469 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:21:25.313118 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Creating image(s) Aug 30 14:21:25.354166 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] rbd image 6d52223d-efe0-461c-b964-611dc2f3103e_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:21:25.389129 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] rbd image 6d52223d-efe0-461c-b964-611dc2f3103e_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:21:25.419369 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] rbd image 6d52223d-efe0-461c-b964-611dc2f3103e_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:21:25.424767 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:21:25.621802 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.197s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:21:25.622426 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:21:25.623267 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:21:25.623636 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:21:25.652176 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] rbd image 6d52223d-efe0-461c-b964-611dc2f3103e_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:21:25.655522 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 6d52223d-efe0-461c-b964-611dc2f3103e_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:21:25.913010 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Successfully created port: f91dc5b6-2970-4e7a-968b-634eae621c68 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:21:26.183663 np0035104604 
nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 6d52223d-efe0-461c-b964-611dc2f3103e_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.526s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:21:26.277972 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:21:26.309689 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] resizing rbd image 6d52223d-efe0-461c-b964-611dc2f3103e_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} Aug 30 14:21:26.432171 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:21:26.432752 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 
6d52223d-efe0-461c-b964-611dc2f3103e] Ensure instance console log exists: /opt/stack/data/nova/instances/6d52223d-efe0-461c-b964-611dc2f3103e/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:21:26.433448 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:21:26.434102 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:21:26.434517 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:21:26.556154 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-ce192f30-c42a-4ef7-ae12-57eddd80e51b tempest-DeleteServersAdminTestJSON-895773713 tempest-DeleteServersAdminTestJSON-895773713-project-member] Acquiring lock "977b8e8e-7768-4786-9a5a-78006565b459" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:21:27.063985 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.786s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:21:27.070698 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:21:27.090284 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:21:27.101909 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 
6d52223d-efe0-461c-b964-611dc2f3103e] Successfully updated port: f91dc5b6-2970-4e7a-968b-634eae621c68 {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:21:27.118178 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquiring lock "refresh_cache-6d52223d-efe0-461c-b964-611dc2f3103e" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:21:27.118447 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquired lock "refresh_cache-6d52223d-efe0-461c-b964-611dc2f3103e" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:21:27.118814 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:21:27.125416 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.632s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:21:27.126470 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 
tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:21:27.134764 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 11.081s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:21:27.135214 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:21:27.135639 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:21:27.136611 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:21:27.279599 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-35eebb2f-2bfd-4732-b569-f3db7b873a85 req-ec54a869-17fa-415e-8061-353569e9d0c5 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Received event 
network-changed-f91dc5b6-2970-4e7a-968b-634eae621c68 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:21:27.280298 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-35eebb2f-2bfd-4732-b569-f3db7b873a85 req-ec54a869-17fa-415e-8061-353569e9d0c5 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Refreshing instance network info cache due to event network-changed-f91dc5b6-2970-4e7a-968b-634eae621c68. {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:21:27.280476 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-35eebb2f-2bfd-4732-b569-f3db7b873a85 req-ec54a869-17fa-415e-8061-353569e9d0c5 service nova] Acquiring lock "refresh_cache-6d52223d-efe0-461c-b964-611dc2f3103e" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:21:27.288767 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:21:27.289005 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:21:27.406008 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:21:27.411373 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:21:27.531738 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Start building block device mappings for instance. 
{{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:21:27.596593 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fcbf43c19164442abf5d6e6303211228', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fb6a45f33bd94db2adee5047c736ad16', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:21:28.002662 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.866s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:21:28.004534 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:21:28.005797 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:21:28.006366 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Creating image(s) Aug 30 14:21:28.034537 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] rbd image d08ac2a0-d12d-46e9-bf89-490056df3cb5_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:21:28.064929 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] rbd image d08ac2a0-d12d-46e9-bf89-490056df3cb5_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:21:28.104577 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] rbd image d08ac2a0-d12d-46e9-bf89-490056df3cb5_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:21:28.108405 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:21:28.182845 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.074s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:21:28.183757 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:21:28.184574 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:21:28.184934 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:21:28.211088 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] rbd image d08ac2a0-d12d-46e9-bf89-490056df3cb5_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:21:28.216788 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e d08ac2a0-d12d-46e9-bf89-490056df3cb5_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:21:28.395681 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Successfully created port: ff801ea9-31be-43ef-a515-871e60302e1e {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:21:28.527494 np0035104604 
nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:21:28.527781 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:21:28.604066 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:21:28.606910 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1777MB free_disk=29.92035675048828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:21:28.606910 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:21:28.607577 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:21:28.818474 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 5db380b6-b80a-4cb1-b65c-571bfe9b4341 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:21:28.818722 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:21:28.818960 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 977b8e8e-7768-4786-9a5a-78006565b459 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:21:28.819196 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 6d52223d-efe0-461c-b964-611dc2f3103e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:21:28.819443 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance d08ac2a0-d12d-46e9-bf89-490056df3cb5 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:21:28.819768 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 5 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:21:28.820041 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=1152MB phys_disk=29GB used_disk=5GB total_vcpus=8 used_vcpus=5 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:21:28.955408 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:29.236027 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Updating instance_info_cache with network_info: [{"id": "f91dc5b6-2970-4e7a-968b-634eae621c68", "address": "fa:16:3e:18:0f:bf", "network": {"id": "9d601ad3-78bb-4369-a497-07df6036222a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-891047702-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "fb6a45f33bd94db2adee5047c736ad16", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": 
"ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf91dc5b6-29", "ovs_interfaceid": "f91dc5b6-2970-4e7a-968b-634eae621c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:21:29.402789 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Successfully updated port: ff801ea9-31be-43ef-a515-871e60302e1e {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:21:29.411142 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ab7218c2-51d2-4a20-80ea-2f74c0a86d36 req-542be4c1-8936-48e8-9539-b7c1be288143 service nova] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Received event network-changed-ff801ea9-31be-43ef-a515-871e60302e1e {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:21:29.411142 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-ab7218c2-51d2-4a20-80ea-2f74c0a86d36 req-542be4c1-8936-48e8-9539-b7c1be288143 service nova] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Refreshing instance network info cache due to event network-changed-ff801ea9-31be-43ef-a515-871e60302e1e. 
{{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:21:29.411142 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ab7218c2-51d2-4a20-80ea-2f74c0a86d36 req-542be4c1-8936-48e8-9539-b7c1be288143 service nova] Acquiring lock "refresh_cache-d08ac2a0-d12d-46e9-bf89-490056df3cb5" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:21:29.411142 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ab7218c2-51d2-4a20-80ea-2f74c0a86d36 req-542be4c1-8936-48e8-9539-b7c1be288143 service nova] Acquired lock "refresh_cache-d08ac2a0-d12d-46e9-bf89-490056df3cb5" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:21:29.411142 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-ab7218c2-51d2-4a20-80ea-2f74c0a86d36 req-542be4c1-8936-48e8-9539-b7c1be288143 service nova] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Refreshing network info cache for port ff801ea9-31be-43ef-a515-871e60302e1e {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:21:29.413057 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Releasing lock "refresh_cache-6d52223d-efe0-461c-b964-611dc2f3103e" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:21:29.413441 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Instance network_info: |[{"id": "f91dc5b6-2970-4e7a-968b-634eae621c68", "address": "fa:16:3e:18:0f:bf", 
"network": {"id": "9d601ad3-78bb-4369-a497-07df6036222a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-891047702-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "fb6a45f33bd94db2adee5047c736ad16", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf91dc5b6-29", "ovs_interfaceid": "f91dc5b6-2970-4e7a-968b-634eae621c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:21:29.417644 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-35eebb2f-2bfd-4732-b569-f3db7b873a85 req-ec54a869-17fa-415e-8061-353569e9d0c5 service nova] Acquired lock "refresh_cache-6d52223d-efe0-461c-b964-611dc2f3103e" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:21:29.417971 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-35eebb2f-2bfd-4732-b569-f3db7b873a85 req-ec54a869-17fa-415e-8061-353569e9d0c5 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Refreshing network info cache for port f91dc5b6-2970-4e7a-968b-634eae621c68 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:21:29.421381 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] 
[instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Start _get_guest_xml network_info=[{"id": "f91dc5b6-2970-4e7a-968b-634eae621c68", "address": "fa:16:3e:18:0f:bf", "network": {"id": "9d601ad3-78bb-4369-a497-07df6036222a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-891047702-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "fb6a45f33bd94db2adee5047c736ad16", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf91dc5b6-29", "ovs_interfaceid": "f91dc5b6-2970-4e7a-968b-634eae621c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 
'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:21:29.426061 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:21:29.428835 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:21:29.429388 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:21:29.431068 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:21:29.431565 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] CPU controller found on host. {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:21:29.436706 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:21:29.437394 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:21:29.437893 np0035104604 
nova-compute[107505]: DEBUG nova.virt.hardware [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:21:29.438296 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:21:29.438780 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:21:29.439180 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:21:29.439595 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:21:29.440136 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Topology preferred 
VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:21:29.440587 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:21:29.441054 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:21:29.441526 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:21:29.441986 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:21:29.627191 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf 
/etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:21:29.991180 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-ab7218c2-51d2-4a20-80ea-2f74c0a86d36 req-542be4c1-8936-48e8-9539-b7c1be288143 service nova] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:21:31.524093 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:21:31.524093 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-ab7218c2-51d2-4a20-80ea-2f74c0a86d36 req-542be4c1-8936-48e8-9539-b7c1be288143 service nova] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:21:31.524093 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-ab7218c2-51d2-4a20-80ea-2f74c0a86d36 req-542be4c1-8936-48e8-9539-b7c1be288143 service nova] Releasing lock "refresh_cache-d08ac2a0-d12d-46e9-bf89-490056df3cb5" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:21:31.524093 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-35eebb2f-2bfd-4732-b569-f3db7b873a85 req-ec54a869-17fa-415e-8061-353569e9d0c5 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Updated VIF entry in instance network info cache for port f91dc5b6-2970-4e7a-968b-634eae621c68. 
{{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:21:31.524093 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-35eebb2f-2bfd-4732-b569-f3db7b873a85 req-ec54a869-17fa-415e-8061-353569e9d0c5 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Updating instance_info_cache with network_info: [{"id": "f91dc5b6-2970-4e7a-968b-634eae621c68", "address": "fa:16:3e:18:0f:bf", "network": {"id": "9d601ad3-78bb-4369-a497-07df6036222a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-891047702-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "fb6a45f33bd94db2adee5047c736ad16", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf91dc5b6-29", "ovs_interfaceid": "f91dc5b6-2970-4e7a-968b-634eae621c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:21:31.524093 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-35eebb2f-2bfd-4732-b569-f3db7b873a85 req-ec54a869-17fa-415e-8061-353569e9d0c5 service nova] Releasing lock "refresh_cache-6d52223d-efe0-461c-b964-611dc2f3103e" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:21:33.957416 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:35.438630 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquiring lock "refresh_cache-d08ac2a0-d12d-46e9-bf89-490056df3cb5" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:21:35.440418 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquired lock "refresh_cache-d08ac2a0-d12d-46e9-bf89-490056df3cb5" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:21:35.440418 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:21:35.604724 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 6.140s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:21:35.632401 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] rbd image 
6d52223d-efe0-461c-b964-611dc2f3103e_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:21:35.641431 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:21:35.739744 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:21:36.026847 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 5.575s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:21:36.035349 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:21:36.048009 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 
1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:21:36.185071 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:21:36.185386 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 7.578s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:21:36.188974 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:21:36.189187 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Cleaning up deleted instances {{(pid=107505) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11129}} Aug 30 14:21:36.202722 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] There are 0 instances to clean {{(pid=107505) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11138}} Aug 30 14:21:36.203106 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None 
req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:21:36.203377 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Cleaning up deleted instances with incomplete migration {{(pid=107505) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11167}} Aug 30 14:21:36.212928 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:21:36.357502 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.716s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:21:36.359085 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:20:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1187532059',display_name='tempest-tempest.common.compute-instance-1187532059-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-tempest-common-compute-instance-1187532059-1',id=21,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb6a45f33bd94db2adee5047c736ad16',ramdisk_id='',reservation_id='r-lddn20nw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1660104447',owner_user_name='tempest-MultipleCreateTestJSON-1660104447-project-member'},tags=TagList,task_state='spawning'
,terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:21:25Z,user_data=None,user_id='fcbf43c19164442abf5d6e6303211228',uuid=6d52223d-efe0-461c-b964-611dc2f3103e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f91dc5b6-2970-4e7a-968b-634eae621c68", "address": "fa:16:3e:18:0f:bf", "network": {"id": "9d601ad3-78bb-4369-a497-07df6036222a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-891047702-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "fb6a45f33bd94db2adee5047c736ad16", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf91dc5b6-29", "ovs_interfaceid": "f91dc5b6-2970-4e7a-968b-634eae621c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:21:36.359577 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Converting VIF {"id": "f91dc5b6-2970-4e7a-968b-634eae621c68", "address": "fa:16:3e:18:0f:bf", "network": {"id": "9d601ad3-78bb-4369-a497-07df6036222a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-891047702-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "fb6a45f33bd94db2adee5047c736ad16", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf91dc5b6-29", "ovs_interfaceid": "f91dc5b6-2970-4e7a-968b-634eae621c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:21:36.360726 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:0f:bf,bridge_name='br-int',has_traffic_filtering=True,id=f91dc5b6-2970-4e7a-968b-634eae621c68,network=Network(9d601ad3-78bb-4369-a497-07df6036222a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf91dc5b6-29') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:21:36.362046 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lazy-loading 'pci_devices' on Instance uuid 6d52223d-efe0-461c-b964-611dc2f3103e {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:21:36.375839 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] End _get_guest_xml xml= Aug 
30 14:21:36.375839 np0035104604 nova-compute[107505]: [guest domain XML garbled by log extraction (tags stripped); recoverable fields: instance uuid 6d52223d-efe0-461c-b964-611dc2f3103e, libvirt name instance-00000015, memory 131072 KiB, 1 vCPU, nova name tempest-tempest.common.compute-instance-1187532059-1, creation time 2023-08-30 14:21:29, flavor values 128/1/0/0/1, owner user tempest-MultipleCreateTestJSON-1660104447-project-member, owner project tempest-MultipleCreateTestJSON-1660104447, sysinfo OpenStack Foundation / OpenStack Nova / 27.1.0, Virtual Machine, os type hvm, CPU model Nehalem, rng backend /dev/urandom] {{(pid=107505) _get_guest_xml
/opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:21:36.386368 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Preparing to wait for external event network-vif-plugged-f91dc5b6-2970-4e7a-968b-634eae621c68 {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:21:36.386368 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquiring lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:21:36.386368 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:21:36.386368 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:21:36.386368 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:20:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1187532059',display_name='tempest-tempest.common.compute-instance-1187532059-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-tempest-common-compute-instance-1187532059-1',id=21,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb6a45f33bd94db2adee5047c736ad16',ramdisk_id='',reservation_id='r-lddn20nw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_o
wner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1660104447',owner_user_name='tempest-MultipleCreateTestJSON-1660104447-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:21:25Z,user_data=None,user_id='fcbf43c19164442abf5d6e6303211228',uuid=6d52223d-efe0-461c-b964-611dc2f3103e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f91dc5b6-2970-4e7a-968b-634eae621c68", "address": "fa:16:3e:18:0f:bf", "network": {"id": "9d601ad3-78bb-4369-a497-07df6036222a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-891047702-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "fb6a45f33bd94db2adee5047c736ad16", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf91dc5b6-29", "ovs_interfaceid": "f91dc5b6-2970-4e7a-968b-634eae621c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:21:36.386869 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Converting VIF {"id": "f91dc5b6-2970-4e7a-968b-634eae621c68", "address": "fa:16:3e:18:0f:bf", "network": {"id": 
"9d601ad3-78bb-4369-a497-07df6036222a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-891047702-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "fb6a45f33bd94db2adee5047c736ad16", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf91dc5b6-29", "ovs_interfaceid": "f91dc5b6-2970-4e7a-968b-634eae621c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:21:36.386869 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:0f:bf,bridge_name='br-int',has_traffic_filtering=True,id=f91dc5b6-2970-4e7a-968b-634eae621c68,network=Network(9d601ad3-78bb-4369-a497-07df6036222a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf91dc5b6-29') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:21:36.386869 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:18:0f:bf,bridge_name='br-int',has_traffic_filtering=True,id=f91dc5b6-2970-4e7a-968b-634eae621c68,network=Network(9d601ad3-78bb-4369-a497-07df6036222a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf91dc5b6-29') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:21:36.386869 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:36.386869 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:21:36.386869 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:21:36.389172 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:36.389545 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf91dc5b6-29, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:21:36.390314 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf91dc5b6-29, 
col_values=(('external_ids', {'iface-id': 'f91dc5b6-2970-4e7a-968b-634eae621c68', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:18:0f:bf', 'vm-uuid': '6d52223d-efe0-461c-b964-611dc2f3103e'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:21:36.392582 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:36.396667 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:21:36.402664 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:36.404259 np0035104604 nova-compute[107505]: INFO os_vif [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:0f:bf,bridge_name='br-int',has_traffic_filtering=True,id=f91dc5b6-2970-4e7a-968b-634eae621c68,network=Network(9d601ad3-78bb-4369-a497-07df6036222a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf91dc5b6-29') Aug 30 14:21:36.552162 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:21:36.552945 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] No BDM found with device name hda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:21:36.552945 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] No VIF found with MAC fa:16:3e:18:0f:bf, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:21:36.553462 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Using config drive Aug 30 14:21:36.579620 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] rbd image 6d52223d-efe0-461c-b964-611dc2f3103e_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:21:36.587922 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Updating instance_info_cache with network_info: [{"id": "ff801ea9-31be-43ef-a515-871e60302e1e", "address": "fa:16:3e:95:52:44", "network": {"id": 
"9d601ad3-78bb-4369-a497-07df6036222a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-891047702-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "fb6a45f33bd94db2adee5047c736ad16", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapff801ea9-31", "ovs_interfaceid": "ff801ea9-31be-43ef-a515-871e60302e1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:21:36.719654 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Releasing lock "refresh_cache-d08ac2a0-d12d-46e9-bf89-490056df3cb5" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:21:36.720162 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Instance network_info: |[{"id": "ff801ea9-31be-43ef-a515-871e60302e1e", "address": "fa:16:3e:95:52:44", "network": {"id": "9d601ad3-78bb-4369-a497-07df6036222a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-891047702-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": 
{"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "fb6a45f33bd94db2adee5047c736ad16", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapff801ea9-31", "ovs_interfaceid": "ff801ea9-31be-43ef-a515-871e60302e1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:21:36.914860 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Creating config drive at /opt/stack/data/nova/instances/6d52223d-efe0-461c-b964-611dc2f3103e/disk.config Aug 30 14:21:36.918778 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/6d52223d-efe0-461c-b964-611dc2f3103e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpb4lskbl1 {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:21:36.949939 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 
tempest-MultipleCreateTestJSON-1660104447-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/6d52223d-efe0-461c-b964-611dc2f3103e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmpb4lskbl1" returned: 0 in 0.029s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:21:36.985102 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] rbd image 6d52223d-efe0-461c-b964-611dc2f3103e_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:21:36.990176 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/6d52223d-efe0-461c-b964-611dc2f3103e/disk.config 6d52223d-efe0-461c-b964-611dc2f3103e_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:21:37.125989 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/6d52223d-efe0-461c-b964-611dc2f3103e/disk.config 6d52223d-efe0-461c-b964-611dc2f3103e_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.135s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:21:37.126517 np0035104604 
nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Deleting local config drive /opt/stack/data/nova/instances/6d52223d-efe0-461c-b964-611dc2f3103e/disk.config because it was imported into RBD. Aug 30 14:21:37.155714 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:37.168851 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:37.180821 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:37.530598 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:37.536131 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:37.542682 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:37.553232 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:37.559067 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:37.653650 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-917d9a52-1b15-4090-95b2-ebf28500471d req-692c8443-e13b-47ff-a8f4-8dc3238d0c55 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Received event network-vif-plugged-f91dc5b6-2970-4e7a-968b-634eae621c68 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:21:37.654240 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-917d9a52-1b15-4090-95b2-ebf28500471d req-692c8443-e13b-47ff-a8f4-8dc3238d0c55 service nova] Acquiring lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:21:37.654626 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-917d9a52-1b15-4090-95b2-ebf28500471d req-692c8443-e13b-47ff-a8f4-8dc3238d0c55 service nova] Lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:21:37.655055 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-917d9a52-1b15-4090-95b2-ebf28500471d req-692c8443-e13b-47ff-a8f4-8dc3238d0c55 service nova] Lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:21:37.655474 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-917d9a52-1b15-4090-95b2-ebf28500471d req-692c8443-e13b-47ff-a8f4-8dc3238d0c55 service nova] 
[instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Processing event network-vif-plugged-f91dc5b6-2970-4e7a-968b-634eae621c68 {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:21:37.980561 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:21:37.980775 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] VM Started (Lifecycle Event) Aug 30 14:21:37.984266 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:21:38.000332 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:21:38.001023 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:21:38.005827 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Synchronizing instance power state after 
lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:21:38.009929 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Instance spawned successfully. Aug 30 14:21:38.010462 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:21:38.023131 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] During sync_power_state the instance has a pending task (spawning). Skip. 
Aug 30 14:21:38.024180 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:21:38.024598 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] VM Paused (Lifecycle Event) Aug 30 14:21:38.039813 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:21:38.045540 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:21:38.046021 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:21:38.046891 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details 
/opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:21:38.047606 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Found default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:21:38.048491 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:21:38.049330 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:21:38.056794 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:21:38.057054 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] VM Resumed (Lifecycle Event) Aug 30 14:21:38.080327 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Checking state 
{{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:21:38.085256 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:21:38.114947 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:21:38.137699 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Took 12.83 seconds to spawn the instance on the hypervisor. Aug 30 14:21:38.138307 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:21:38.220114 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Took 39.65 seconds to build instance. 
Aug 30 14:21:38.238301 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-2b5b02b1-782c-45a5-8cdf-70d0bf0a0694 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "6d52223d-efe0-461c-b964-611dc2f3103e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 39.777s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:21:38.886748 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:39.220564 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:21:39.221422 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:21:39.222125 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:21:39.222752 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 
30 14:21:39.223341 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:21:39.250584 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:21:39.251382 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:21:39.251999 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:21:39.252630 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Skipping network cache update for instance because it is Building. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:21:39.455236 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "refresh_cache-6d52223d-efe0-461c-b964-611dc2f3103e" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:21:39.455587 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquired lock "refresh_cache-6d52223d-efe0-461c-b964-611dc2f3103e" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:21:39.455967 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Forcefully refreshing network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}} Aug 30 14:21:39.456361 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lazy-loading 'info_cache' on Instance uuid 6d52223d-efe0-461c-b964-611dc2f3103e {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:21:39.838775 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-0fb8ae9d-9625-49f6-a982-4761d019ea3b req-343747ed-0776-40c7-ad64-205366bb31f4 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Received event network-vif-plugged-f91dc5b6-2970-4e7a-968b-634eae621c68 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:21:39.839051 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0fb8ae9d-9625-49f6-a982-4761d019ea3b req-343747ed-0776-40c7-ad64-205366bb31f4 service nova] Acquiring lock 
"6d52223d-efe0-461c-b964-611dc2f3103e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:21:39.839429 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0fb8ae9d-9625-49f6-a982-4761d019ea3b req-343747ed-0776-40c7-ad64-205366bb31f4 service nova] Lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:21:39.839711 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0fb8ae9d-9625-49f6-a982-4761d019ea3b req-343747ed-0776-40c7-ad64-205366bb31f4 service nova] Lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:21:39.840071 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-0fb8ae9d-9625-49f6-a982-4761d019ea3b req-343747ed-0776-40c7-ad64-205366bb31f4 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] No waiting events found dispatching network-vif-plugged-f91dc5b6-2970-4e7a-968b-634eae621c68 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:21:39.840352 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-0fb8ae9d-9625-49f6-a982-4761d019ea3b req-343747ed-0776-40c7-ad64-205366bb31f4 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Received unexpected event network-vif-plugged-f91dc5b6-2970-4e7a-968b-634eae621c68 for instance with vm_state active and task_state None. 
Aug 30 14:21:40.834811 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Updating instance_info_cache with network_info: [{"id": "f91dc5b6-2970-4e7a-968b-634eae621c68", "address": "fa:16:3e:18:0f:bf", "network": {"id": "9d601ad3-78bb-4369-a497-07df6036222a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-891047702-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "fb6a45f33bd94db2adee5047c736ad16", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf91dc5b6-29", "ovs_interfaceid": "f91dc5b6-2970-4e7a-968b-634eae621c68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:21:40.854340 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Releasing lock "refresh_cache-6d52223d-efe0-461c-b964-611dc2f3103e" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:21:40.854833 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Updated the network info_cache for instance {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} Aug 30 
14:21:40.855964 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:21:40.856489 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:21:40.856945 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:21:40.857479 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:21:40.858107 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:21:40.858482 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:21:40.858738 
np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:21:41.394781 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:43.888624 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:46.398668 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:51.402613 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:53.866948 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._sync_power_states {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:21:53.892645 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:21:53.897301 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] While synchronizing instance power states, found 5 instances in the database and 2 instances on the hypervisor. 
Aug 30 14:21:53.897519 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Sync already in progress for 5db380b6-b80a-4cb1-b65c-571bfe9b4341 {{(pid=107505) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10250}} Aug 30 14:21:53.897784 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Triggering sync for uuid f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75 {{(pid=107505) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10252}} Aug 30 14:21:53.898248 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Triggering sync for uuid 977b8e8e-7768-4786-9a5a-78006565b459 {{(pid=107505) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10252}} Aug 30 14:21:53.898564 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Triggering sync for uuid 6d52223d-efe0-461c-b964-611dc2f3103e {{(pid=107505) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10252}} Aug 30 14:21:53.898828 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Triggering sync for uuid d08ac2a0-d12d-46e9-bf89-490056df3cb5 {{(pid=107505) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10252}} Aug 30 14:21:53.899612 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:21:53.899984 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] 
Acquiring lock "977b8e8e-7768-4786-9a5a-78006565b459" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:21:53.900309 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "6d52223d-efe0-461c-b964-611dc2f3103e" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:21:53.900638 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "6d52223d-efe0-461c-b964-611dc2f3103e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:21:53.900992 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "d08ac2a0-d12d-46e9-bf89-490056df3cb5" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:21:53.923461 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "6d52223d-efe0-461c-b964-611dc2f3103e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.023s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:21:56.406256 np0035104604 
nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:01.410530 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:06.414281 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:22:11.418660 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:13.895554 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:16.422850 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:17.025641 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:22:17.051219 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:22:17.051219 np0035104604 
nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:22:17.051219 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:22:17.051667 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:22:17.051885 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:22:17.626623 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.574s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:22:17.711702 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000015 as it does not have a path {{(pid=107505) 
_get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:22:17.712062 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000015 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:22:17.716689 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:22:17.716689 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:22:17.809845 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Aug 30 14:22:17.811145 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1364MB free_disk=29.85552215576172GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:22:17.811445 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:22:17.811865 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:22:18.169305 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 5db380b6-b80a-4cb1-b65c-571bfe9b4341 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:22:18.169613 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:22:18.169931 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 977b8e8e-7768-4786-9a5a-78006565b459 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:22:18.170227 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 6d52223d-efe0-461c-b964-611dc2f3103e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:22:18.170517 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance d08ac2a0-d12d-46e9-bf89-490056df3cb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:22:18.170930 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 5 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:22:18.171271 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=1152MB phys_disk=29GB used_disk=5GB total_vcpus=8 used_vcpus=5 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:22:18.362791 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Refreshing inventories for resource provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) _refresh_associations 
/opt/stack/nova/nova/scheduler/client/report.py:804}} Aug 30 14:22:18.543305 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Updating ProviderTree inventory for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Aug 30 14:22:18.543788 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Updating inventory in ProviderTree for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Aug 30 14:22:18.699688 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Refreshing aggregate associations for resource provider 600ab55f-530c-4be6-bf02-067d68ce7ee4, aggregates: None {{(pid=107505) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Aug 30 14:22:18.887351 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Refreshing trait associations for resource 
provider 600ab55f-530c-4be6-bf02-067d68ce7ee4, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_SMMUV3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI {{(pid=107505) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Aug 30 14:22:19.009381 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:19.772868 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:22:20.346043 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.572s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:22:20.353160 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:22:20.370574 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:22:20.408971 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:22:20.409569 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.597s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:22:21.425585 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:22.690640 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:23.899350 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:24.410627 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:22:24.411108 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:22:24.411445 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:22:24.436490 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Skipping network cache update for instance because it is Building. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:22:24.437021 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:22:24.437398 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:22:24.437843 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Skipping network cache update for instance because it is Building. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:22:24.632469 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "refresh_cache-6d52223d-efe0-461c-b964-611dc2f3103e" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:22:24.632710 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquired lock "refresh_cache-6d52223d-efe0-461c-b964-611dc2f3103e" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:22:24.633085 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Forcefully refreshing network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}} Aug 30 14:22:24.633289 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lazy-loading 'info_cache' on Instance uuid 6d52223d-efe0-461c-b964-611dc2f3103e {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:22:25.797675 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:25.995011 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Updating instance_info_cache with network_info: [{"id": "f91dc5b6-2970-4e7a-968b-634eae621c68", "address": "fa:16:3e:18:0f:bf", "network": {"id": "9d601ad3-78bb-4369-a497-07df6036222a", "bridge": "br-int", "label": 
"tempest-MultipleCreateTestJSON-891047702-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "fb6a45f33bd94db2adee5047c736ad16", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf91dc5b6-29", "ovs_interfaceid": "f91dc5b6-2970-4e7a-968b-634eae621c68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:22:26.010109 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Releasing lock "refresh_cache-6d52223d-efe0-461c-b964-611dc2f3103e" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:22:26.010376 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Updated the network info_cache for instance {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} Aug 30 14:22:26.010987 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:22:26.011242 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task 
[None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:22:26.011502 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:22:26.011763 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:22:26.012000 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:22:26.012245 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:22:26.012504 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:22:26.428819 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:26.619372 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:22:28.901798 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:31.431876 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:33.902959 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:34.136193 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:36.389384 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:36.433593 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:38.904881 np0035104604 nova-compute[107505]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:39.192564 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Acquiring lock "80874308-ea97-470f-b592-2c218f708ac6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:22:39.193072 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Lock "80874308-ea97-470f-b592-2c218f708ac6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:22:39.214312 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Starting instance... 
{{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:22:39.450936 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:22:39.451160 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:22:39.457600 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:22:39.457933 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Claim successful on node np0035104604 Aug 30 14:22:40.611778 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:22:41.261734 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.650s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:22:41.267879 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:22:41.285713 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Inventory has not changed for provider 
600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:22:41.324106 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.873s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:22:41.325018 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:22:41.399232 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:22:41.399797 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:22:41.530842 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:41.532628 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:22:41.556516 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Start building block device mappings for instance. 
{{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:22:41.641981 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9360b47ffe0e4a588ed46ae7c901da0e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '713b193a8389489d9ecbb1114d3cddb1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize /opt/stack/nova/nova/policy.py:203}} Aug 30 14:22:41.771264 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Start spawning the instance on the hypervisor. 
{{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:22:41.772572 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:22:41.773156 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Creating image(s) Aug 30 14:22:41.803073 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] rbd image 80874308-ea97-470f-b592-2c218f708ac6_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:22:41.836894 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] rbd image 80874308-ea97-470f-b592-2c218f708ac6_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:22:41.872427 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] rbd image 80874308-ea97-470f-b592-2c218f708ac6_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:22:41.876839 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:22:44.755829 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:44.762643 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 2.886s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:22:44.763825 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:22:44.765470 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 
tempest-ImagesOneServerTestJSON-427126162-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.002s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:22:44.766173 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:22:47.814073 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] rbd image 80874308-ea97-470f-b592-2c218f708ac6_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:22:47.818021 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 80874308-ea97-470f-b592-2c218f708ac6_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:22:47.843136 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:48.388142 np0035104604 
nova-compute[107505]: DEBUG nova.network.neutron [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Successfully created port: e551f013-8b8f-4f0b-a55d-53b59513d3d3 {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:22:49.150168 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Successfully updated port: e551f013-8b8f-4f0b-a55d-53b59513d3d3 {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:22:49.167606 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Acquiring lock "refresh_cache-80874308-ea97-470f-b592-2c218f708ac6" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:22:49.167871 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Acquired lock "refresh_cache-80874308-ea97-470f-b592-2c218f708ac6" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:22:49.168335 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Building network info cache for instance {{(pid=107505) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:22:49.358806 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:22:49.479587 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-7ff421fb-bd3d-4025-b0ee-f6585571832b req-27e59f5a-faab-4484-a4fe-e9992961884d service nova] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Received event network-changed-e551f013-8b8f-4f0b-a55d-53b59513d3d3 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:22:49.479857 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-7ff421fb-bd3d-4025-b0ee-f6585571832b req-27e59f5a-faab-4484-a4fe-e9992961884d service nova] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Refreshing instance network info cache due to event network-changed-e551f013-8b8f-4f0b-a55d-53b59513d3d3. 
{{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:22:49.480183 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-7ff421fb-bd3d-4025-b0ee-f6585571832b req-27e59f5a-faab-4484-a4fe-e9992961884d service nova] Acquiring lock "refresh_cache-80874308-ea97-470f-b592-2c218f708ac6" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:22:50.173228 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Updating instance_info_cache with network_info: [{"id": "e551f013-8b8f-4f0b-a55d-53b59513d3d3", "address": "fa:16:3e:79:b1:79", "network": {"id": "6bc6f19c-22ac-401a-a1d9-b1741c6fc4b7", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1173843448-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "713b193a8389489d9ecbb1114d3cddb1", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape551f013-8b", "ovs_interfaceid": "e551f013-8b8f-4f0b-a55d-53b59513d3d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:22:50.191672 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None 
req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Releasing lock "refresh_cache-80874308-ea97-470f-b592-2c218f708ac6" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:22:50.192365 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-22337a16-3147-4f0f-b763-f1569f1c5e40 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Instance network_info: |[{"id": "e551f013-8b8f-4f0b-a55d-53b59513d3d3", "address": "fa:16:3e:79:b1:79", "network": {"id": "6bc6f19c-22ac-401a-a1d9-b1741c6fc4b7", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1173843448-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "713b193a8389489d9ecbb1114d3cddb1", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape551f013-8b", "ovs_interfaceid": "e551f013-8b8f-4f0b-a55d-53b59513d3d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:22:50.193437 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-7ff421fb-bd3d-4025-b0ee-f6585571832b req-27e59f5a-faab-4484-a4fe-e9992961884d service nova] Acquired lock "refresh_cache-80874308-ea97-470f-b592-2c218f708ac6" {{(pid=107505) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:22:50.193906 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-7ff421fb-bd3d-4025-b0ee-f6585571832b req-27e59f5a-faab-4484-a4fe-e9992961884d service nova] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Refreshing network info cache for port e551f013-8b8f-4f0b-a55d-53b59513d3d3 {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:22:51.339497 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-7ff421fb-bd3d-4025-b0ee-f6585571832b req-27e59f5a-faab-4484-a4fe-e9992961884d service nova] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Updated VIF entry in instance network info cache for port e551f013-8b8f-4f0b-a55d-53b59513d3d3. {{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:22:51.340267 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-7ff421fb-bd3d-4025-b0ee-f6585571832b req-27e59f5a-faab-4484-a4fe-e9992961884d service nova] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Updating instance_info_cache with network_info: [{"id": "e551f013-8b8f-4f0b-a55d-53b59513d3d3", "address": "fa:16:3e:79:b1:79", "network": {"id": "6bc6f19c-22ac-401a-a1d9-b1741c6fc4b7", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1173843448-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "713b193a8389489d9ecbb1114d3cddb1", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape551f013-8b", "ovs_interfaceid": 
"e551f013-8b8f-4f0b-a55d-53b59513d3d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:22:51.364464 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-7ff421fb-bd3d-4025-b0ee-f6585571832b req-27e59f5a-faab-4484-a4fe-e9992961884d service nova] Releasing lock "refresh_cache-80874308-ea97-470f-b592-2c218f708ac6" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:22:52.845720 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:53.909524 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:22:57.847560 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:23:02.849160 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:23:07.850482 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:23:09.015081 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:23:12.852624 np0035104604 
nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:23:17.855174 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:23:18.025797 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:23:18.055679 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:23:18.056101 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:23:18.056433 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:23:18.056845 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 
None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:23:18.057323 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:23:18.832634 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.775s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:23:18.924011 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000015 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:23:18.924601 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000015 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:23:18.929269 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:23:18.929515 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for 
instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:23:18.997094 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:23:18.998463 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1255MB free_disk=29.835182189941406GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:23:18.998748 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:23:18.999222 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:23:19.234529 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 5db380b6-b80a-4cb1-b65c-571bfe9b4341 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:23:19.234856 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:23:19.235072 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 977b8e8e-7768-4786-9a5a-78006565b459 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:23:19.235317 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 6d52223d-efe0-461c-b964-611dc2f3103e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:23:19.235593 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance d08ac2a0-d12d-46e9-bf89-490056df3cb5 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:23:19.235836 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 80874308-ea97-470f-b592-2c218f708ac6 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:23:19.236175 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 6 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:23:19.236433 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=1280MB phys_disk=29GB used_disk=6GB total_vcpus=8 used_vcpus=6 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:23:20.225587 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:23:20.987427 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.761s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:23:20.995103 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:23:21.013047 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not 
changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:23:21.051900 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:23:21.052380 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.053s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:23:22.858063 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:23:24.053968 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:23:24.055089 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:23:24.055688 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:23:24.088510 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:23:24.089372 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:23:24.089961 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:23:24.090562 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:23:24.091086 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Skipping network cache update for instance because it is Building. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:23:24.296317 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "refresh_cache-6d52223d-efe0-461c-b964-611dc2f3103e" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:23:24.299939 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquired lock "refresh_cache-6d52223d-efe0-461c-b964-611dc2f3103e" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:23:24.299939 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Forcefully refreshing network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}} Aug 30 14:23:24.299939 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lazy-loading 'info_cache' on Instance uuid 6d52223d-efe0-461c-b964-611dc2f3103e {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:23:25.536941 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Updating instance_info_cache with network_info: [{"id": "f91dc5b6-2970-4e7a-968b-634eae621c68", "address": "fa:16:3e:18:0f:bf", "network": {"id": "9d601ad3-78bb-4369-a497-07df6036222a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-891047702-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "fb6a45f33bd94db2adee5047c736ad16", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf91dc5b6-29", "ovs_interfaceid": "f91dc5b6-2970-4e7a-968b-634eae621c68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:23:25.550529 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Releasing lock "refresh_cache-6d52223d-efe0-461c-b964-611dc2f3103e" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:23:25.550803 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Updated the network info_cache for instance {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} Aug 30 14:23:25.551089 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:23:25.551776 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:23:25.552012 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:23:25.552236 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:23:25.552465 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:23:26.025635 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:23:26.026014 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:23:27.019064 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:23:27.861447 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:23:29.018660 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:23:32.863324 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:23:37.865447 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:23:42.867643 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:23:47.869777 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:23:52.871696 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:23:57.873271 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:02.876608 np0035104604 nova-compute[107505]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:07.878105 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:12.881118 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:14.300939 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-f0480da6-a233-4317-bb0b-c72262e168b2 tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquiring lock "d08ac2a0-d12d-46e9-bf89-490056df3cb5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:24:14.409692 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquiring lock "6d52223d-efe0-461c-b964-611dc2f3103e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:24:14.410052 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "6d52223d-efe0-461c-b964-611dc2f3103e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:24:14.410563 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquiring lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:24:14.410956 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:24:14.411567 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:24:14.416904 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Terminating instance Aug 30 14:24:14.420051 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None 
req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Start destroying the instance on the hypervisor. {{(pid=107505) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Aug 30 14:24:14.451797 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:14.459346 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:14.474287 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:14.487393 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:14.497841 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:14.505802 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:14.699337 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Instance destroyed successfully. 
Aug 30 14:24:14.700021 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lazy-loading 'resources' on Instance uuid 6d52223d-efe0-461c-b964-611dc2f3103e {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:24:14.711047 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-2214a56e-b064-4c49-b9ba-5e91ffcd67ee req-d519daad-c5cb-4546-91d7-659f02edc8b6 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Received event network-vif-unplugged-f91dc5b6-2970-4e7a-968b-634eae621c68 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:24:14.711047 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-2214a56e-b064-4c49-b9ba-5e91ffcd67ee req-d519daad-c5cb-4546-91d7-659f02edc8b6 service nova] Acquiring lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:24:14.711047 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-2214a56e-b064-4c49-b9ba-5e91ffcd67ee req-d519daad-c5cb-4546-91d7-659f02edc8b6 service nova] Lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:24:14.711047 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-2214a56e-b064-4c49-b9ba-5e91ffcd67ee req-d519daad-c5cb-4546-91d7-659f02edc8b6 service nova] Lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s 
{{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:24:14.711047 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-2214a56e-b064-4c49-b9ba-5e91ffcd67ee req-d519daad-c5cb-4546-91d7-659f02edc8b6 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] No waiting events found dispatching network-vif-unplugged-f91dc5b6-2970-4e7a-968b-634eae621c68 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:24:14.711655 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-2214a56e-b064-4c49-b9ba-5e91ffcd67ee req-d519daad-c5cb-4546-91d7-659f02edc8b6 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Received event network-vif-unplugged-f91dc5b6-2970-4e7a-968b-634eae621c68 for instance with task_state deleting. {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Aug 30 14:24:14.714933 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:20:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1187532059',display_name='tempest-tempest.common.compute-instance-1187532059-1',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-tempest-common-compute-instance-1187532059-1',id=21,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:21:38Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='fb6a45f33bd94db2adee5047c736ad16',ramdisk_id='',reservation_id='r-lddn20nw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MultipleCreateTestJSON-166010444
7',owner_user_name='tempest-MultipleCreateTestJSON-1660104447-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-08-30T14:21:38Z,user_data=None,user_id='fcbf43c19164442abf5d6e6303211228',uuid=6d52223d-efe0-461c-b964-611dc2f3103e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f91dc5b6-2970-4e7a-968b-634eae621c68", "address": "fa:16:3e:18:0f:bf", "network": {"id": "9d601ad3-78bb-4369-a497-07df6036222a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-891047702-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "fb6a45f33bd94db2adee5047c736ad16", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf91dc5b6-29", "ovs_interfaceid": "f91dc5b6-2970-4e7a-968b-634eae621c68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:24:14.715622 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Converting VIF {"id": "f91dc5b6-2970-4e7a-968b-634eae621c68", "address": "fa:16:3e:18:0f:bf", "network": {"id": "9d601ad3-78bb-4369-a497-07df6036222a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-891047702-network", "subnets": [{"cidr": "10.1.0.0/28", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.1.0.1"}}], "meta": {"injected": false, "tenant_id": "fb6a45f33bd94db2adee5047c736ad16", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf91dc5b6-29", "ovs_interfaceid": "f91dc5b6-2970-4e7a-968b-634eae621c68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:24:14.717051 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:18:0f:bf,bridge_name='br-int',has_traffic_filtering=True,id=f91dc5b6-2970-4e7a-968b-634eae621c68,network=Network(9d601ad3-78bb-4369-a497-07df6036222a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf91dc5b6-29') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:24:14.717600 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:18:0f:bf,bridge_name='br-int',has_traffic_filtering=True,id=f91dc5b6-2970-4e7a-968b-634eae621c68,network=Network(9d601ad3-78bb-4369-a497-07df6036222a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf91dc5b6-29') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 
14:24:14.723394 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:14.723963 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf91dc5b6-29, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:24:14.725510 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:14.728677 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:24:14.731211 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:14.736604 np0035104604 nova-compute[107505]: INFO os_vif [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:18:0f:bf,bridge_name='br-int',has_traffic_filtering=True,id=f91dc5b6-2970-4e7a-968b-634eae621c68,network=Network(9d601ad3-78bb-4369-a497-07df6036222a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf91dc5b6-29') Aug 30 14:24:14.833700 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:15.063361 np0035104604 nova-compute[107505]: 
INFO nova.virt.libvirt.driver [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Deleting instance files /opt/stack/data/nova/instances/6d52223d-efe0-461c-b964-611dc2f3103e_del Aug 30 14:24:15.064319 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Deletion of /opt/stack/data/nova/instances/6d52223d-efe0-461c-b964-611dc2f3103e_del complete Aug 30 14:24:15.132275 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Took 0.71 seconds to destroy the instance on the hypervisor. Aug 30 14:24:15.133103 np0035104604 nova-compute[107505]: DEBUG oslo.service.loopingcall [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=107505) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} Aug 30 14:24:15.133593 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [-] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Deallocating network for instance {{(pid=107505) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Aug 30 14:24:15.134015 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] deallocate_for_instance() {{(pid=107505) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Aug 30 14:24:15.777839 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [-] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Updating instance_info_cache with network_info: [] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:24:15.796526 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Took 0.66 seconds to deallocate network for instance. 
Aug 30 14:24:15.867868 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:24:15.868469 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:24:16.824116 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-9b0fc481-b5d7-4d01-99b6-f8187ec61981 req-e19f42f4-d016-443f-9678-b7f283b9e023 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Received event network-vif-plugged-f91dc5b6-2970-4e7a-968b-634eae621c68 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:24:16.824945 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-9b0fc481-b5d7-4d01-99b6-f8187ec61981 req-e19f42f4-d016-443f-9678-b7f283b9e023 service nova] Acquiring lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:24:16.824945 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-9b0fc481-b5d7-4d01-99b6-f8187ec61981 req-e19f42f4-d016-443f-9678-b7f283b9e023 service nova] Lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:24:16.824945 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-9b0fc481-b5d7-4d01-99b6-f8187ec61981 req-e19f42f4-d016-443f-9678-b7f283b9e023 service nova] Lock "6d52223d-efe0-461c-b964-611dc2f3103e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:24:16.825354 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-9b0fc481-b5d7-4d01-99b6-f8187ec61981 req-e19f42f4-d016-443f-9678-b7f283b9e023 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] No waiting events found dispatching network-vif-plugged-f91dc5b6-2970-4e7a-968b-634eae621c68 {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:24:16.825646 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-9b0fc481-b5d7-4d01-99b6-f8187ec61981 req-e19f42f4-d016-443f-9678-b7f283b9e023 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Received unexpected event network-vif-plugged-f91dc5b6-2970-4e7a-968b-634eae621c68 for instance with vm_state deleted and task_state None. 
Aug 30 14:24:16.825828 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-9b0fc481-b5d7-4d01-99b6-f8187ec61981 req-e19f42f4-d016-443f-9678-b7f283b9e023 service nova] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Received event network-vif-deleted-f91dc5b6-2970-4e7a-968b-634eae621c68 {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:24:16.846108 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:24:17.410696 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.564s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:24:17.418537 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:24:17.434891 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Inventory has not changed for provider 
600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:24:17.469707 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.601s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:24:17.621233 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Deleted allocations for instance 6d52223d-efe0-461c-b964-611dc2f3103e Aug 30 14:24:17.722305 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-3554eef4-de97-411c-aeb2-157e4352b38d tempest-MultipleCreateTestJSON-1660104447 tempest-MultipleCreateTestJSON-1660104447-project-member] Lock "6d52223d-efe0-461c-b964-611dc2f3103e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.312s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:24:18.025649 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:24:18.050346 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:24:18.050951 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:24:18.051395 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:24:18.051833 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:24:18.052318 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:24:18.641116 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.585s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:24:18.715016 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:24:18.715290 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:24:18.784445 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Aug 30 14:24:18.786536 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1749MB free_disk=29.83517837524414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:24:18.786952 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:24:18.787465 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:24:18.990088 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 5db380b6-b80a-4cb1-b65c-571bfe9b4341 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:24:18.990580 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:24:18.990959 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 977b8e8e-7768-4786-9a5a-78006565b459 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:24:18.991326 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance d08ac2a0-d12d-46e9-bf89-490056df3cb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:24:18.991689 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 80874308-ea97-470f-b592-2c218f708ac6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:24:18.992248 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 5 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:24:18.992612 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=1152MB phys_disk=29GB used_disk=5GB total_vcpus=8 used_vcpus=5 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:24:19.830297 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:19.853603 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None 
req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:24:20.460566 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.606s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:24:20.468206 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:24:20.485976 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:24:20.523604 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:24:20.524071 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.736s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:24:23.525474 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:24:24.025368 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:24:24.025840 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:24:24.026085 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:24:24.057485 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Skipping network cache update for instance because it is Building. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:24:24.057823 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:24:24.058168 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:24:24.058492 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:24:24.058799 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:24:24.059076 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Didn't find any instances for network info cache update. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9928}} Aug 30 14:24:24.059920 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:24:24.060197 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:24:24.832332 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:24:26.025958 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:24:26.026696 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:24:26.026784 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:24:27.025543 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None 
req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:24:28.018428 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:24:29.699963 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:24:29.700978 np0035104604 nova-compute[107505]: INFO nova.compute.manager [-] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] VM Stopped (Lifecycle Event) Aug 30 14:24:29.724193 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-5532d722-ba34-4142-812d-300293c38476 None None] [instance: 6d52223d-efe0-461c-b964-611dc2f3103e] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:24:29.833982 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:24:34.835864 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:39.837299 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:44.839002 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:49.840754 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:50.770326 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ca93f986-f4b6-4273-a29f-b9acc718c827 tempest-HypervisorAdminTestJSON-1592904260 tempest-HypervisorAdminTestJSON-1592904260-project-admin] Running cmd (subprocess): env LANG=C uptime {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:24:50.792396 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-ca93f986-f4b6-4273-a29f-b9acc718c827 tempest-HypervisorAdminTestJSON-1592904260 tempest-HypervisorAdminTestJSON-1592904260-project-admin] CMD "env LANG=C uptime" returned: 0 in 0.021s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:24:54.842059 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:24:59.844466 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:04.845577 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:09.847186 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:14.849010 np0035104604 nova-compute[107505]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:25:14.849753 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:14.850059 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=107505) run /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:117}} Aug 30 14:25:14.850233 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:25:14.850818 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=107505) _transition /opt/stack/data/venv/lib/python3.10/site-packages/ovs/reconnect.py:519}} Aug 30 14:25:14.851254 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:18.026419 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager.update_available_resource {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:25:18.056130 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 
30 14:25:18.056578 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:18.057104 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:18.057527 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Auditing locally available compute resources for np0035104604 (node: np0035104604) {{(pid=107505) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Aug 30 14:25:18.058015 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:25:19.854235 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:21.638425 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 3.580s {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:25:21.696182 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:25:21.696350 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] skipping disk for instance-00000012 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:25:21.746300 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:25:21.747629 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Hypervisor/Node resource view: name=np0035104604 free_ram=1738MB free_disk=29.880977630615234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", 
"product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "00b8", "vendor_id": "1013", "numa_node": null, "label": "label_1013_00b8", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] {{(pid=107505) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} Aug 30 14:25:21.747942 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:21.748358 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:21.956981 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 5db380b6-b80a-4cb1-b65c-571bfe9b4341 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:25:21.957405 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:25:21.957835 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 977b8e8e-7768-4786-9a5a-78006565b459 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:25:21.958161 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance d08ac2a0-d12d-46e9-bf89-490056df3cb5 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. {{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:25:21.958512 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Instance 80874308-ea97-470f-b592-2c218f708ac6 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
{{(pid=107505) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} Aug 30 14:25:21.958921 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Total usable vcpus: 8, total allocated vcpus: 5 {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} Aug 30 14:25:21.959274 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Final resource view: name=np0035104604 phys_ram=7746MB used_ram=1152MB phys_disk=29GB used_disk=5GB total_vcpus=8 used_vcpus=5 pci_stats=[] {{(pid=107505) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} Aug 30 14:25:22.855706 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:25:23.468537 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.613s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:25:23.474426 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:25:23.489006 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Inventory has not 
changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:25:23.492736 np0035104604 nova-compute[107505]: DEBUG nova.compute.resource_tracker [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Compute_service record updated for np0035104604:np0035104604 {{(pid=107505) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} Aug 30 14:25:23.493034 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.745s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:24.856636 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:24.952093 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "231af25a-406a-40ff-bc7d-605b55a0cc97" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:24.952406 np0035104604 nova-compute[107505]: DEBUG 
oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:24.970909 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Starting instance... {{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Aug 30 14:25:25.204701 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:25.205068 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:25.209920 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Require both a host and instance NUMA topology to 
fit instance on host. {{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:25:25.210281 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Claim successful on node np0035104604 Aug 30 14:25:26.320445 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:25:27.047051 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.726s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:25:27.054043 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:25:27.071186 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Inventory has not changed for provider 
600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:25:27.112365 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.907s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:27.113202 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Aug 30 14:25:27.176495 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Allocating IP information in the background. 
{{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Aug 30 14:25:27.176903 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Aug 30 14:25:27.306668 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Aug 30 14:25:27.329897 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Start building block device mappings for instance. 
{{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Aug 30 14:25:27.493550 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:25:27.494018 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Starting heal instance info cache {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Aug 30 14:25:27.494456 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Rebuilding the list of instances to heal {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Aug 30 14:25:27.524561 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 5db380b6-b80a-4cb1-b65c-571bfe9b4341] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:25:27.525101 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: f4db13f4-ab0a-4bbb-a585-1ea2cdf93e75] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:25:27.525620 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 977b8e8e-7768-4786-9a5a-78006565b459] Skipping network cache update for instance because it is Building. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:25:27.526025 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: d08ac2a0-d12d-46e9-bf89-490056df3cb5] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:25:27.526468 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 80874308-ea97-470f-b592-2c218f708ac6] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:25:27.527379 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Skipping network cache update for instance because it is Building. {{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Aug 30 14:25:27.527742 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Didn't find any instances for network info cache update. 
{{(pid=107505) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9928}} Aug 30 14:25:27.529042 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:25:27.530327 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Start spawning the instance on the hypervisor. {{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Aug 30 14:25:27.531660 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Aug 30 14:25:27.532326 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Creating image(s) Aug 30 14:25:27.564491 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] rbd image 231af25a-406a-40ff-bc7d-605b55a0cc97_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:25:27.591921 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None 
req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] rbd image 231af25a-406a-40ff-bc7d-605b55a0cc97_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:25:27.619733 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] rbd image 231af25a-406a-40ff-bc7d-605b55a0cc97_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:25:27.623378 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Running cmd (subprocess): /opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:25:27.647978 np0035104604 nova-compute[107505]: DEBUG nova.policy [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '920a55a4ab704ae183b4770500f4b96c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7d2c92a904ff440e852cd97f87d075cd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=107505) authorize 
/opt/stack/nova/nova/policy.py:203}} Aug 30 14:25:27.652637 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:25:27.779846 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:25:27.779846 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:25:27.779846 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=107505) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Aug 30 14:25:27.780372 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CMD "/opt/stack/data/venv/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e --force-share --output=json" returned: 0 in 0.157s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:25:27.781133 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:27.781965 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:27.782379 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "d9a6e3c5d51a1e374e75caad4bc1a3059519a24e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 
0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:27.808279 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] rbd image 231af25a-406a-40ff-bc7d-605b55a0cc97_disk does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:25:27.812189 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 231af25a-406a-40ff-bc7d-605b55a0cc97_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:25:28.162695 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/_base/d9a6e3c5d51a1e374e75caad4bc1a3059519a24e 231af25a-406a-40ff-bc7d-605b55a0cc97_disk --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.350s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:25:28.242754 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] resizing rbd image 231af25a-406a-40ff-bc7d-605b55a0cc97_disk to 1073741824 {{(pid=107505) resize /opt/stack/nova/nova/storage/rbd_utils.py:288}} 
Aug 30 14:25:28.284599 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:25:28.348001 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Created local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Aug 30 14:25:28.348611 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Ensure instance console log exists: /opt/stack/data/nova/instances/231af25a-406a-40ff-bc7d-605b55a0cc97/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:25:28.348813 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:28.349200 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=107505) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:28.349524 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:28.614880 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Successfully created port: f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Aug 30 14:25:29.017614 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:25:29.024694 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:25:29.113815 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:29.427278 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 
tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Successfully updated port: f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Aug 30 14:25:29.449035 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "refresh_cache-231af25a-406a-40ff-bc7d-605b55a0cc97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:25:29.449266 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquired lock "refresh_cache-231af25a-406a-40ff-bc7d-605b55a0cc97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:25:29.449582 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:25:29.615586 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-0badc5cf-83c6-4d20-bc91-aff5635ad4ae req-fd1be2d2-60fd-4446-ab3b-b3c51e952210 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Received event network-changed-f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:25:29.616029 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-0badc5cf-83c6-4d20-bc91-aff5635ad4ae 
req-fd1be2d2-60fd-4446-ab3b-b3c51e952210 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Refreshing instance network info cache due to event network-changed-f0f50d17-b2cf-4731-b750-b2729686345f. {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Aug 30 14:25:29.616524 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0badc5cf-83c6-4d20-bc91-aff5635ad4ae req-fd1be2d2-60fd-4446-ab3b-b3c51e952210 service nova] Acquiring lock "refresh_cache-231af25a-406a-40ff-bc7d-605b55a0cc97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:25:29.681023 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Instance cache missing network info. {{(pid=107505) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Aug 30 14:25:29.956818 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:30.125990 np0035104604 nova-compute[107505]: DEBUG oslo_service.periodic_task [None req-36ab8a54-e904-49a4-b6da-d1ce43739b33 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=107505) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} Aug 30 14:25:30.514221 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Updating instance_info_cache with network_info: [{"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": 
"fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:25:30.534577 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Releasing lock "refresh_cache-231af25a-406a-40ff-bc7d-605b55a0cc97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:25:30.535079 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Instance network_info: |[{"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Aug 30 14:25:30.535587 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0badc5cf-83c6-4d20-bc91-aff5635ad4ae req-fd1be2d2-60fd-4446-ab3b-b3c51e952210 service nova] Acquired lock "refresh_cache-231af25a-406a-40ff-bc7d-605b55a0cc97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:25:30.535919 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-0badc5cf-83c6-4d20-bc91-aff5635ad4ae req-fd1be2d2-60fd-4446-ab3b-b3c51e952210 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Refreshing network info cache for port f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Aug 30 14:25:30.540640 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Start _get_guest_xml network_info=[{"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", 
"bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml 
/opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:25:30.545762 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:25:30.670610 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:25:30.671284 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:25:30.676280 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:25:30.676674 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:25:30.678212 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:25:30.678841 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:25:30.679155 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:25:30.679374 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware 
[None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:25:30.679767 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:25:30.680000 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:25:30.680357 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:25:30.680743 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:25:30.681042 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Build topologies for 1 
vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:25:30.681395 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:25:30.681757 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:25:30.682075 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:25:30.697103 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:25:31.570015 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.873s {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:25:31.603132 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] rbd image 231af25a-406a-40ff-bc7d-605b55a0cc97_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:25:31.609993 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:25:31.937995 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-0badc5cf-83c6-4d20-bc91-aff5635ad4ae req-fd1be2d2-60fd-4446-ab3b-b3c51e952210 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Updated VIF entry in instance network info cache for port f0f50d17-b2cf-4731-b750-b2729686345f. 
{{(pid=107505) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Aug 30 14:25:31.938989 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [req-0badc5cf-83c6-4d20-bc91-aff5635ad4ae req-fd1be2d2-60fd-4446-ab3b-b3c51e952210 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Updating instance_info_cache with network_info: [{"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:25:31.956746 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-0badc5cf-83c6-4d20-bc91-aff5635ad4ae req-fd1be2d2-60fd-4446-ab3b-b3c51e952210 service nova] Releasing lock "refresh_cache-231af25a-406a-40ff-bc7d-605b55a0cc97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:25:32.285098 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 
tempest-MigrationsAdminTest-1109016108-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.675s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:25:32.286653 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:25:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MigrationsAdminTest-server-189428375',display_name='tempest-MigrationsAdminTest-server-189428375',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-migrationsadmintest-server-189428375',id=24,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d2c92a904ff440e852cd97f87d075cd',ramdisk_id='',reservation_id='r-2nm7z0oy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_
rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MigrationsAdminTest-1109016108',owner_user_name='tempest-MigrationsAdminTest-1109016108-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:25:27Z,user_data=None,user_id='920a55a4ab704ae183b4770500f4b96c',uuid=231af25a-406a-40ff-bc7d-605b55a0cc97,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:25:32.287068 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 
tempest-MigrationsAdminTest-1109016108-project-member] Converting VIF {"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:25:32.287986 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:a1:3d,bridge_name='br-int',has_traffic_filtering=True,id=f0f50d17-b2cf-4731-b750-b2729686345f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0f50d17-b2') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:25:32.289159 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lazy-loading 
'pci_devices' on Instance uuid 231af25a-406a-40ff-bc7d-605b55a0cc97 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:25:32.301544 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] End _get_guest_xml xml= [guest domain XML elided: its angle-bracket markup was stripped during log capture, leaving only element values interleaved with journald prefixes. Recoverable values: uuid 231af25a-406a-40ff-bc7d-605b55a0cc97, name instance-00000018, memory 131072 KiB, 1 vCPU, nova metadata (server tempest-MigrationsAdminTest-server-189428375, creationTime 2023-08-30 14:25:30, flavor 128 MB RAM / 1 GB root disk / 1 vCPU, owner tempest-MigrationsAdminTest-1109016108-project-member in project tempest-MigrationsAdminTest-1109016108), SMBIOS sysinfo OpenStack Foundation / OpenStack Nova / 27.1.0 / Virtual Machine, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:25:32.307401 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Preparing to wait for external event network-vif-plugged-f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) prepare_for_instance_event /opt/stack/nova/nova/compute/manager.py:283}} Aug 30 14:25:32.307401 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:32.307401 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:32.307401 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils
[None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:32.307401 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-08-30T14:25:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MigrationsAdminTest-server-189428375',display_name='tempest-MigrationsAdminTest-server-189428375',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-migrationsadmintest-server-189428375',id=24,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d2c92a904ff440e852cd97f87d075cd',ramdisk_id='',reservation_id='r-2nm7z0oy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='r
eader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MigrationsAdminTest-1109016108',owner_user_name='tempest-MigrationsAdminTest-1109016108-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:25:27Z,user_data=None,user_id='920a55a4ab704ae183b4770500f4b96c',uuid=231af25a-406a-40ff-bc7d-605b55a0cc97,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:25:32.307843 np0035104604 nova-compute[107505]: DEBUG 
nova.network.os_vif_util [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Converting VIF {"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:25:32.307843 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:a1:3d,bridge_name='br-int',has_traffic_filtering=True,id=f0f50d17-b2cf-4731-b750-b2729686345f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0f50d17-b2') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:25:32.307843 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-8bae6357-d465-49ab-95dc-9d7985524b5f 
tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:a1:3d,bridge_name='br-int',has_traffic_filtering=True,id=f0f50d17-b2cf-4731-b750-b2729686345f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0f50d17-b2') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:25:32.308070 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:32.308876 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:25:32.309334 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:25:32.314409 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:32.315511 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0f50d17-b2, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:25:32.316321 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn 
n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf0f50d17-b2, col_values=(('external_ids', {'iface-id': 'f0f50d17-b2cf-4731-b750-b2729686345f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:a1:3d', 'vm-uuid': '231af25a-406a-40ff-bc7d-605b55a0cc97'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:25:32.318393 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:32.324412 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:25:32.330537 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:32.333912 np0035104604 nova-compute[107505]: INFO os_vif [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:a1:3d,bridge_name='br-int',has_traffic_filtering=True,id=f0f50d17-b2cf-4731-b750-b2729686345f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0f50d17-b2') Aug 30 14:25:32.378832 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:25:32.379145 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] No BDM found with device name hda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:25:32.379342 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] No VIF found with MAC fa:16:3e:17:a1:3d, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:25:32.379971 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Using config drive Aug 30 14:25:32.408327 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] rbd image 231af25a-406a-40ff-bc7d-605b55a0cc97_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:25:32.805975 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Creating config drive at /opt/stack/data/nova/instances/231af25a-406a-40ff-bc7d-605b55a0cc97/disk.config Aug 30 14:25:32.810357 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils 
[None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/231af25a-406a-40ff-bc7d-605b55a0cc97/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmp68ittko2 {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:25:32.846446 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/231af25a-406a-40ff-bc7d-605b55a0cc97/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 27.1.0 -quiet -J -r -V config-2 /tmp/tmp68ittko2" returned: 0 in 0.035s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:25:32.881656 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] rbd image 231af25a-406a-40ff-bc7d-605b55a0cc97_disk.config does not exist {{(pid=107505) __init__ /opt/stack/nova/nova/storage/rbd_utils.py:80}} Aug 30 14:25:32.892167 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Running cmd (subprocess): rbd import --pool vms /opt/stack/data/nova/instances/231af25a-406a-40ff-bc7d-605b55a0cc97/disk.config 231af25a-406a-40ff-bc7d-605b55a0cc97_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:25:33.099246 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CMD "rbd import --pool vms /opt/stack/data/nova/instances/231af25a-406a-40ff-bc7d-605b55a0cc97/disk.config 231af25a-406a-40ff-bc7d-605b55a0cc97_disk.config --image-format=2 --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.206s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:25:33.100367 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Deleting local config drive /opt/stack/data/nova/instances/231af25a-406a-40ff-bc7d-605b55a0cc97/disk.config because it was imported into RBD. 
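The two config-drive commands logged just above are easier to read as argv lists: oslo_concurrency's processutils logs them space-joined, which hides that the genisoimage publisher string "OpenStack Nova 27.1.0" is a single argument. A hedged sketch of how those argv lists fit together (paths, pool name, scratch directory, and instance UUID are specific to this run):

```python
# Reconstruction of the two subprocess invocations from the log above.
# This is an illustration of the logged argv structure, not Nova's code.
instance_uuid = "231af25a-406a-40ff-bc7d-605b55a0cc97"
config_drive = f"/opt/stack/data/nova/instances/{instance_uuid}/disk.config"

# Step 1: build the config-drive ISO locally.
genisoimage_cmd = [
    "genisoimage", "-o", config_drive,
    "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
    "-publisher", "OpenStack Nova 27.1.0",  # one argv element; space-joined in the log
    "-quiet", "-J", "-r", "-V", "config-2",  # volume label cloud-init looks for
    "/tmp/tmp68ittko2",  # scratch dir holding the metadata files (from this run)
]

# Step 2: import the ISO into the Ceph "vms" pool, then the local copy is deleted.
rbd_import_cmd = [
    "rbd", "import", "--pool", "vms", config_drive,
    f"{instance_uuid}_disk.config",
    "--image-format=2", "--id", "cinder", "--conf", "/etc/ceph/ceph.conf",
]
```

The import explains the subsequent "Deleting local config drive ... because it was imported into RBD" record: once the image lives in the pool, the on-disk ISO is redundant.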
Aug 30 14:25:33.133435 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:33.173893 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:33.176174 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:33.178193 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:33.391141 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-5157af2e-e674-4f70-9553-b024e74ae028 req-921585dd-12e7-4dc8-a36a-ccee1e1fcdd2 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Received event network-vif-plugged-f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:25:33.391544 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-5157af2e-e674-4f70-9553-b024e74ae028 req-921585dd-12e7-4dc8-a36a-ccee1e1fcdd2 service nova] Acquiring lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:33.391964 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-5157af2e-e674-4f70-9553-b024e74ae028 req-921585dd-12e7-4dc8-a36a-ccee1e1fcdd2 service nova] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: 
waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:33.392273 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-5157af2e-e674-4f70-9553-b024e74ae028 req-921585dd-12e7-4dc8-a36a-ccee1e1fcdd2 service nova] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:33.392662 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-5157af2e-e674-4f70-9553-b024e74ae028 req-921585dd-12e7-4dc8-a36a-ccee1e1fcdd2 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Processing event network-vif-plugged-f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10792}} Aug 30 14:25:33.522306 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:33.530185 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:33.536539 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:33.548526 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:33.553746 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:33.931156 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:33.978260 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:25:33.978559 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] VM Started (Lifecycle Event) Aug 30 14:25:33.990554 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Instance event wait completed in 0 seconds for network-vif-plugged {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:25:34.005468 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:25:34.007932 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Guest created on hypervisor {{(pid=107505) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Aug 30 14:25:34.012737 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 
231af25a-406a-40ff-bc7d-605b55a0cc97] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:25:34.017525 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Instance spawned successfully. Aug 30 14:25:34.018258 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Aug 30 14:25:34.039496 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] During sync_power_state the instance has a pending task (spawning). Skip. 
Aug 30 14:25:34.040339 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Paused> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:25:34.040563 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] VM Paused (Lifecycle Event) Aug 30 14:25:34.048643 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Found default for hw_cdrom_bus of ide {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:25:34.048882 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Found default for hw_disk_bus of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:25:34.049491 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Found default for hw_input_bus of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:25:34.050081 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Found 
default for hw_pointer_model of None {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:25:34.050808 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Found default for hw_video_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:25:34.051521 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Found default for hw_vif_model of virtio {{(pid=107505) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Aug 30 14:25:34.059686 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:25:34.066111 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:25:34.066345 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] VM Resumed (Lifecycle Event) Aug 30 14:25:34.086247 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Checking state {{(pid=107505) _get_power_state 
/opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:25:34.092158 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:25:34.132553 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] During sync_power_state the instance has a pending task (spawning). Skip. Aug 30 14:25:34.152395 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Took 6.62 seconds to spawn the instance on the hypervisor. Aug 30 14:25:34.153059 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:25:34.243343 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Took 9.19 seconds to build instance. 
Aug 30 14:25:34.264979 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8bae6357-d465-49ab-95dc-9d7985524b5f tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.312s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:35.462457 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-47f94f5d-73f5-44c7-9a8c-8fecc8f66f55 req-c43dcedd-5529-48fc-a254-ff1f0cefeea4 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Received event network-vif-plugged-f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:25:35.462916 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-47f94f5d-73f5-44c7-9a8c-8fecc8f66f55 req-c43dcedd-5529-48fc-a254-ff1f0cefeea4 service nova] Acquiring lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:35.463367 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-47f94f5d-73f5-44c7-9a8c-8fecc8f66f55 req-c43dcedd-5529-48fc-a254-ff1f0cefeea4 service nova] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:35.463758 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-47f94f5d-73f5-44c7-9a8c-8fecc8f66f55 req-c43dcedd-5529-48fc-a254-ff1f0cefeea4 service nova] Lock 
"231af25a-406a-40ff-bc7d-605b55a0cc97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:35.464177 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-47f94f5d-73f5-44c7-9a8c-8fecc8f66f55 req-c43dcedd-5529-48fc-a254-ff1f0cefeea4 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] No waiting events found dispatching network-vif-plugged-f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:25:35.464592 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-47f94f5d-73f5-44c7-9a8c-8fecc8f66f55 req-c43dcedd-5529-48fc-a254-ff1f0cefeea4 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Received unexpected event network-vif-plugged-f0f50d17-b2cf-4731-b750-b2729686345f for instance with vm_state active and task_state None. 
Aug 30 14:25:37.318505 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:38.610281 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Stashing vm_state: active {{(pid=107505) _prep_resize /opt/stack/nova/nova/compute/manager.py:5535}} Aug 30 14:25:38.851719 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:38.852274 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:38.892738 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Aug 30 14:25:38.893026 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Claim successful on node np0035104604 Aug 30 14:25:38.932672 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:38.952837 np0035104604 nova-compute[107505]: INFO nova.compute.resource_tracker [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Updating resource usage from migration 379d1aeb-e026-42e2-82b3-0ed1adfe38c8 Aug 30 14:25:40.112727 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:25:40.766228 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.653s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:25:40.773128 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b 
tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Aug 30 14:25:40.788933 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Aug 30 14:25:40.834602 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 1.982s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:40.835091 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Migrating Aug 30 14:25:40.835715 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 
tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "compute-rpcapi-router" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:25:40.836113 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquired lock "compute-rpcapi-router" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:25:40.843238 np0035104604 nova-compute[107505]: INFO nova.compute.rpcapi [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Automatically selected compute RPC version 6.2 from minimum service version 66 Aug 30 14:25:40.845009 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Releasing lock "compute-rpcapi-router" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:25:40.891610 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "refresh_cache-231af25a-406a-40ff-bc7d-605b55a0cc97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:25:40.891959 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquired lock "refresh_cache-231af25a-406a-40ff-bc7d-605b55a0cc97" {{(pid=107505) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:25:40.892621 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:25:42.318858 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Updating instance_info_cache with network_info: [{"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:25:42.321104 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 
{{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:42.338912 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Releasing lock "refresh_cache-231af25a-406a-40ff-bc7d-605b55a0cc97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:25:42.484599 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Starting migrate_disk_and_power_off {{(pid=107505) migrate_disk_and_power_off /opt/stack/nova/nova/virt/libvirt/driver.py:11487}} Aug 30 14:25:42.512927 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:42.526082 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:42.537463 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:42.549495 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:42.568527 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 
14:25:42.693679 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-f2ab0fc2-7857-4eed-994f-f46913729fcc req-b0be8572-2d5e-4f41-8d49-7c4d007c45e3 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Received event network-vif-unplugged-f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:25:42.695079 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-f2ab0fc2-7857-4eed-994f-f46913729fcc req-b0be8572-2d5e-4f41-8d49-7c4d007c45e3 service nova] Acquiring lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:42.695079 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-f2ab0fc2-7857-4eed-994f-f46913729fcc req-b0be8572-2d5e-4f41-8d49-7c4d007c45e3 service nova] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:42.695079 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-f2ab0fc2-7857-4eed-994f-f46913729fcc req-b0be8572-2d5e-4f41-8d49-7c4d007c45e3 service nova] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:42.695079 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-f2ab0fc2-7857-4eed-994f-f46913729fcc req-b0be8572-2d5e-4f41-8d49-7c4d007c45e3 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] No waiting events found dispatching 
network-vif-unplugged-f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:25:42.695321 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-f2ab0fc2-7857-4eed-994f-f46913729fcc req-b0be8572-2d5e-4f41-8d49-7c4d007c45e3 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Received unexpected event network-vif-unplugged-f0f50d17-b2cf-4731-b750-b2729686345f for instance with vm_state active and task_state resize_migrating. Aug 30 14:25:42.731562 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Instance destroyed successfully. Aug 30 14:25:42.732950 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:25:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-MigrationsAdminTest-server-189428375',display_name='tempest-MigrationsAdminTest-server-189428375',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='np0035104604',hostname='tempest-migrationsadmintest-server-189428375',id=24,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:25:34Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(12),node='np0035104604',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceLis
t,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7d2c92a904ff440e852cd97f87d075cd',ramdisk_id='',reservation_id='r-2nm7z0oy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',old_vm_state='active',owner_project_name='tempest-MigrationsAdminTest-1109016108',owner_user_name='tempest-MigrationsAdminTest-1109016108-project-member'},tags=,task_state='resize_migrating',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:25:39Z,user_data=None,user_id='920a55a4ab704ae183b4770500f4b96c',uuid=231af25a-406a-40ff-bc7d-605b55a0cc97,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "shared", "vif_mac": "fa:16:3e:17:a1:3d"}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Aug 30 14:25:42.733752 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Converting VIF {"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "shared", "vif_mac": "fa:16:3e:17:a1:3d"}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:25:42.734527 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 
tempest-MigrationsAdminTest-1109016108-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:a1:3d,bridge_name='br-int',has_traffic_filtering=True,id=f0f50d17-b2cf-4731-b750-b2729686345f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0f50d17-b2') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:25:42.735303 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:a1:3d,bridge_name='br-int',has_traffic_filtering=True,id=f0f50d17-b2cf-4731-b750-b2729686345f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0f50d17-b2') {{(pid=107505) unplug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:109}} Aug 30 14:25:42.737227 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:42.737615 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0f50d17-b2, bridge=br-int, if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:25:42.738766 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:42.742118 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:25:42.743767 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:42.747943 np0035104604 nova-compute[107505]: INFO os_vif [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:a1:3d,bridge_name='br-int',has_traffic_filtering=True,id=f0f50d17-b2cf-4731-b750-b2729686345f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0f50d17-b2') Aug 30 14:25:42.753869 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] skipping disk for instance-00000018 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:25:42.754158 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] skipping disk for instance-00000018 as it does not have a path {{(pid=107505) _get_instance_disk_info_from_config /opt/stack/nova/nova/virt/libvirt/driver.py:11207}} Aug 30 14:25:42.934261 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Port f0f50d17-b2cf-4731-b750-b2729686345f binding to destination host np0035104604 is already ACTIVE 
{{(pid=107505) migrate_instance_start /opt/stack/nova/nova/network/neutron.py:3163}} Aug 30 14:25:43.036279 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:43.036606 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:43.036913 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:43.250470 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "refresh_cache-231af25a-406a-40ff-bc7d-605b55a0cc97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:25:43.250993 
np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquired lock "refresh_cache-231af25a-406a-40ff-bc7d-605b55a0cc97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:25:43.251354 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:25:44.440570 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Updating instance_info_cache with network_info: [{"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:25:44.457013 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Releasing lock "refresh_cache-231af25a-406a-40ff-bc7d-605b55a0cc97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:25:44.728222 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Starting finish_migration {{(pid=107505) finish_migration /opt/stack/nova/nova/virt/libvirt/driver.py:11674}} Aug 30 14:25:44.729296 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Instance directory exists: not creating {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4713}} Aug 30 14:25:44.729545 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Creating image(s) Aug 30 14:25:44.763487 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] creating snapshot(nova-resize) on rbd image(231af25a-406a-40ff-bc7d-605b55a0cc97_disk) 
{{(pid=107505) create_snap /opt/stack/nova/nova/storage/rbd_utils.py:462}} Aug 30 14:25:45.052558 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-09cd71cf-0f5f-4e2a-ba3d-441f515cf97b req-ee56d626-fde6-457e-9f31-d5a3dfa4c5e0 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Received event network-vif-plugged-f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:25:45.052872 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-09cd71cf-0f5f-4e2a-ba3d-441f515cf97b req-ee56d626-fde6-457e-9f31-d5a3dfa4c5e0 service nova] Acquiring lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:45.053191 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-09cd71cf-0f5f-4e2a-ba3d-441f515cf97b req-ee56d626-fde6-457e-9f31-d5a3dfa4c5e0 service nova] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:45.053927 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-09cd71cf-0f5f-4e2a-ba3d-441f515cf97b req-ee56d626-fde6-457e-9f31-d5a3dfa4c5e0 service nova] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:45.053927 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-09cd71cf-0f5f-4e2a-ba3d-441f515cf97b req-ee56d626-fde6-457e-9f31-d5a3dfa4c5e0 service nova] [instance: 
231af25a-406a-40ff-bc7d-605b55a0cc97] No waiting events found dispatching network-vif-plugged-f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:25:45.054168 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-09cd71cf-0f5f-4e2a-ba3d-441f515cf97b req-ee56d626-fde6-457e-9f31-d5a3dfa4c5e0 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Received unexpected event network-vif-plugged-f0f50d17-b2cf-4731-b750-b2729686345f for instance with vm_state active and task_state resize_finish. Aug 30 14:25:45.430986 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Did not create local disks {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4853}} Aug 30 14:25:45.431328 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Ensure instance console log exists: /opt/stack/data/nova/instances/231af25a-406a-40ff-bc7d-605b55a0cc97/console.log {{(pid=107505) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Aug 30 14:25:45.432137 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:45.432550 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None 
req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:45.432912 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:45.436967 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Start _get_guest_xml network_info=[{"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "shared", "vif_mac": "fa:16:3e:17:a1:3d"}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'a32d892d-d3ba-4b7e-9bab-7e06f730b9e0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Aug 30 14:25:45.443247 np0035104604 nova-compute[107505]: WARNING nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Aug 30 14:25:45.446367 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V1... 
{{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Aug 30 14:25:45.447103 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CPU controller missing on host. {{(pid=107505) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Aug 30 14:25:45.449140 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Searching host: 'np0035104604' for CPU controller through CGroups V2... {{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Aug 30 14:25:45.449578 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CPU controller found on host. 
{{(pid=107505) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Aug 30 14:25:45.451692 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=107505) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Aug 30 14:25:45.452550 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-08-30T14:02:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='84',id=12,is_public=True,memory_mb=192,name='m1.micro',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-08-30T14:01:32Z,direct_url=,disk_format='qcow2',id=a32d892d-d3ba-4b7e-9bab-7e06f730b9e0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='7f1ed4be4a61468fa9c662d2b5fbcb56',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-08-30T14:01:43Z,virtual_size=,visibility=), allow threads: True {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Aug 30 14:25:45.453236 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Flavor limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Aug 30 14:25:45.453844 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware 
[None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Image limits 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Aug 30 14:25:45.454305 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Flavor pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Aug 30 14:25:45.454680 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Image pref 0:0:0 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Aug 30 14:25:45.455141 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=107505) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Aug 30 14:25:45.455638 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Aug 30 14:25:45.456057 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Build topologies for 1 
vcpu(s) 1:1:1 {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Aug 30 14:25:45.456558 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Got 1 possible topologies {{(pid=107505) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Aug 30 14:25:45.457013 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Aug 30 14:25:45.457513 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=107505) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Aug 30 14:25:45.470625 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:25:46.074677 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.603s {{(pid=107505) execute 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:25:46.116931 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Running cmd (subprocess): ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}} Aug 30 14:25:46.714618 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CMD "ceph mon dump --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.598s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}} Aug 30 14:25:46.716903 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:25:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-MigrationsAdminTest-server-189428375',display_name='tempest-MigrationsAdminTest-server-189428375',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(12),hidden=False,host='np0035104604',hostname='tempest-migrationsadmintest-server-189428375',id=24,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=12,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:25:34Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(12),node='np0035104604',numa_topology=None,old_flavor=Flavor(11),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7d2c92a904ff440e852cd97f87d075cd',ramdisk_id='',reservation_id='r-2nm7z0oy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',old_vm_stat
e='active',owner_project_name='tempest-MigrationsAdminTest-1109016108',owner_user_name='tempest-MigrationsAdminTest-1109016108-project-member'},tags=,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:25:43Z,user_data=None,user_id='920a55a4ab704ae183b4770500f4b96c',uuid=231af25a-406a-40ff-bc7d-605b55a0cc97,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "shared", "vif_mac": "fa:16:3e:17:a1:3d"}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=qemu {{(pid=107505) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Aug 30 14:25:46.717316 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Converting VIF {"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": 
[], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "shared", "vif_mac": "fa:16:3e:17:a1:3d"}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:25:46.718614 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:a1:3d,bridge_name='br-int',has_traffic_filtering=True,id=f0f50d17-b2cf-4731-b750-b2729686345f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0f50d17-b2') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:25:46.722046 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] End _get_guest_xml xml= Aug 30 14:25:46.722046 np0035104604 nova-compute[107505]: 231af25a-406a-40ff-bc7d-605b55a0cc97 Aug 30 14:25:46.722046 np0035104604 nova-compute[107505]: instance-00000018 Aug 30 
14:25:46.722046 np0035104604 nova-compute[107505]: [multi-line libvirt guest domain XML elided: element markup was stripped during log extraction, leaving only bare values. Recoverable values: memory 196608 KiB, 1 vCPU, instance name tempest-MigrationsAdminTest-server-189428375, creationTime 2023-08-30 14:25:45, flavor values 192/1/0/0/1, owner tempest-MigrationsAdminTest-1109016108-project-member (project tempest-MigrationsAdminTest-1109016108), sysinfo OpenStack Foundation / OpenStack Nova / 27.1.0, serial and uuid 231af25a-406a-40ff-bc7d-605b55a0cc97, system family Virtual Machine, os type hvm, cpu model Nehalem, rng backend /dev/urandom] {{(pid=107505) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Aug 30 14:25:46.725228 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.vif [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108
tempest-MigrationsAdminTest-1109016108-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-08-30T14:25:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-MigrationsAdminTest-server-189428375',display_name='tempest-MigrationsAdminTest-server-189428375',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(12),hidden=False,host='np0035104604',hostname='tempest-migrationsadmintest-server-189428375',id=24,image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',info_cache=InstanceInfoCache,instance_type_id=12,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-08-30T14:25:34Z,launched_on='np0035104604',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(12),node='np0035104604',numa_topology=None,old_flavor=Flavor(11),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7d2c92a904ff440e852cd97f87d075cd',ramdisk_id='',reservation_id='r-2nm7z0oy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32d892d-d3ba-4b7e-9bab-7e06f730b9e0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.
2-x86_64-disk',image_owner_specified.openstack.sha256='',old_vm_state='active',owner_project_name='tempest-MigrationsAdminTest-1109016108',owner_user_name='tempest-MigrationsAdminTest-1109016108-project-member'},tags=,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2023-08-30T14:25:43Z,user_data=None,user_id='920a55a4ab704ae183b4770500f4b96c',uuid=231af25a-406a-40ff-bc7d-605b55a0cc97,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "shared", "vif_mac": "fa:16:3e:17:a1:3d"}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Aug 30 14:25:46.725228 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Converting VIF {"id": "f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", 
"subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "shared", "vif_mac": "fa:16:3e:17:a1:3d"}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Aug 30 14:25:46.725228 np0035104604 nova-compute[107505]: DEBUG nova.network.os_vif_util [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:a1:3d,bridge_name='br-int',has_traffic_filtering=True,id=f0f50d17-b2cf-4731-b750-b2729686345f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0f50d17-b2') {{(pid=107505) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Aug 30 14:25:46.725552 np0035104604 nova-compute[107505]: DEBUG os_vif [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:17:a1:3d,bridge_name='br-int',has_traffic_filtering=True,id=f0f50d17-b2cf-4731-b750-b2729686345f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0f50d17-b2') {{(pid=107505) plug /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:76}} Aug 30 14:25:46.725552 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:46.726182 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:25:46.726652 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Aug 30 14:25:46.729966 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 16 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:46.730328 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0f50d17-b2, may_exist=True, interface_attrs={}) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:25:46.730870 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf0f50d17-b2, 
col_values=(('external_ids', {'iface-id': 'f0f50d17-b2cf-4731-b750-b2729686345f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:a1:3d', 'vm-uuid': '231af25a-406a-40ff-bc7d-605b55a0cc97'}),), if_exists=True) {{(pid=107505) do_commit /opt/stack/data/venv/lib/python3.10/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Aug 30 14:25:46.732704 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:46.737433 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:248}} Aug 30 14:25:46.741068 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:46.742897 np0035104604 nova-compute[107505]: INFO os_vif [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:a1:3d,bridge_name='br-int',has_traffic_filtering=True,id=f0f50d17-b2cf-4731-b750-b2729686345f,network=Network(ca7fb427-3840-468e-9b3e-32c12834c98b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0f50d17-b2') Aug 30 14:25:46.782554 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:25:46.782931 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] No BDM found with device name hda, not building metadata. {{(pid=107505) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Aug 30 14:25:46.783345 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] No VIF found with MAC fa:16:3e:17:a1:3d, not building metadata {{(pid=107505) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Aug 30 14:25:46.784405 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Using config drive Aug 30 14:25:46.845412 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:46.864693 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:46.870287 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:46.873572 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup 
/opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:47.112994 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-c4da1016-11ff-420a-8df8-4b1bc388ffd7 req-82d4ba98-5667-4249-8810-8d6d4b74516a service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Received event network-vif-plugged-f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:25:47.113530 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c4da1016-11ff-420a-8df8-4b1bc388ffd7 req-82d4ba98-5667-4249-8810-8d6d4b74516a service nova] Acquiring lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:47.114438 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c4da1016-11ff-420a-8df8-4b1bc388ffd7 req-82d4ba98-5667-4249-8810-8d6d4b74516a service nova] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:47.114889 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-c4da1016-11ff-420a-8df8-4b1bc388ffd7 req-82d4ba98-5667-4249-8810-8d6d4b74516a service nova] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:47.115331 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-c4da1016-11ff-420a-8df8-4b1bc388ffd7 req-82d4ba98-5667-4249-8810-8d6d4b74516a service nova] [instance: 
231af25a-406a-40ff-bc7d-605b55a0cc97] No waiting events found dispatching network-vif-plugged-f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:25:47.115839 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-c4da1016-11ff-420a-8df8-4b1bc388ffd7 req-82d4ba98-5667-4249-8810-8d6d4b74516a service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Received unexpected event network-vif-plugged-f0f50d17-b2cf-4731-b750-b2729686345f for instance with vm_state active and task_state resize_finish. Aug 30 14:25:47.201424 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:47.207545 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:47.211953 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:47.223039 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:47.230214 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:47.757767 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.host [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Removed pending event for 231af25a-406a-40ff-bc7d-605b55a0cc97 due to event {{(pid=107505) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:438}} Aug 30 14:25:47.757767 
np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Resumed> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:25:47.757767 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] VM Resumed (Lifecycle Event) Aug 30 14:25:47.757767 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Instance event wait completed in 0 seconds for {{(pid=107505) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Aug 30 14:25:47.757767 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [-] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Instance running successfully. Aug 30 14:25:47.759748 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.guest [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Failed to set time: agent not configured {{(pid=107505) sync_guest_time /opt/stack/nova/nova/virt/libvirt/guest.py:200}} Aug 30 14:25:47.760096 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-92d08542-d7c2-418b-9be8-ab3c34c5086b tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] finish_migration finished successfully. 
{{(pid=107505) finish_migration /opt/stack/nova/nova/virt/libvirt/driver.py:11769}} Aug 30 14:25:47.770157 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:25:47.774911 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:25:47.797523 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] During sync_power_state the instance has a pending task (resize_finish). Skip. 
Aug 30 14:25:47.798046 np0035104604 nova-compute[107505]: DEBUG nova.virt.driver [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] Emitting event Started> {{(pid=107505) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Aug 30 14:25:47.798449 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] VM Started (Lifecycle Event) Aug 30 14:25:47.819454 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Checking state {{(pid=107505) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Aug 30 14:25:47.826564 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 {{(pid=107505) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Aug 30 14:25:47.854865 np0035104604 nova-compute[107505]: INFO nova.compute.manager [None req-e92c1aa2-8833-4538-8cb3-a951c911fb42 None None] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] During sync_power_state the instance has a pending task (resize_finish). Skip. 
Aug 30 14:25:48.935301 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}} Aug 30 14:25:49.291098 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [req-a64a1feb-5be5-461b-ba2b-ca4ff1c92185 req-864005aa-e9f4-4d61-ba91-21d3de5331b6 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Received event network-vif-plugged-f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Aug 30 14:25:49.291423 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-a64a1feb-5be5-461b-ba2b-ca4ff1c92185 req-864005aa-e9f4-4d61-ba91-21d3de5331b6 service nova] Acquiring lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:49.291743 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-a64a1feb-5be5-461b-ba2b-ca4ff1c92185 req-864005aa-e9f4-4d61-ba91-21d3de5331b6 service nova] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:49.291987 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [req-a64a1feb-5be5-461b-ba2b-ca4ff1c92185 req-864005aa-e9f4-4d61-ba91-21d3de5331b6 service nova] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} Aug 30 14:25:49.292295 np0035104604 nova-compute[107505]: DEBUG 
nova.compute.manager [req-a64a1feb-5be5-461b-ba2b-ca4ff1c92185 req-864005aa-e9f4-4d61-ba91-21d3de5331b6 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] No waiting events found dispatching network-vif-plugged-f0f50d17-b2cf-4731-b750-b2729686345f {{(pid=107505) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Aug 30 14:25:49.292545 np0035104604 nova-compute[107505]: WARNING nova.compute.manager [req-a64a1feb-5be5-461b-ba2b-ca4ff1c92185 req-864005aa-e9f4-4d61-ba91-21d3de5331b6 service nova] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Received unexpected event network-vif-plugged-f0f50d17-b2cf-4731-b750-b2729686345f for instance with vm_state resized and task_state None. Aug 30 14:25:49.619194 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "231af25a-406a-40ff-bc7d-605b55a0cc97" by "nova.compute.manager.ComputeManager.confirm_resize..do_confirm_resize" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} Aug 30 14:25:49.619766 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97" acquired by "nova.compute.manager.ComputeManager.confirm_resize..do_confirm_resize" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} Aug 30 14:25:49.620288 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Going to confirm migration 1 {{(pid=107505) 
do_confirm_resize /opt/stack/nova/nova/compute/manager.py:4694}} Aug 30 14:25:49.928094 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "refresh_cache-231af25a-406a-40ff-bc7d-605b55a0cc97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} Aug 30 14:25:49.928512 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquired lock "refresh_cache-231af25a-406a-40ff-bc7d-605b55a0cc97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} Aug 30 14:25:49.928991 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Building network info cache for instance {{(pid=107505) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Aug 30 14:25:49.929508 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lazy-loading 'info_cache' on Instance uuid 231af25a-406a-40ff-bc7d-605b55a0cc97 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:25:51.232963 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 231af25a-406a-40ff-bc7d-605b55a0cc97] Updating instance_info_cache with network_info: [{"id": 
"f0f50d17-b2cf-4731-b750-b2729686345f", "address": "fa:16:3e:17:a1:3d", "network": {"id": "ca7fb427-3840-468e-9b3e-32c12834c98b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.1"}}], "meta": {"injected": false, "tenant_id": "7f1ed4be4a61468fa9c662d2b5fbcb56", "mtu": 1372, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0f50d17-b2", "ovs_interfaceid": "f0f50d17-b2cf-4731-b750-b2729686345f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=107505) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Aug 30 14:25:51.255778 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Releasing lock "refresh_cache-231af25a-406a-40ff-bc7d-605b55a0cc97" {{(pid=107505) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} Aug 30 14:25:51.256314 np0035104604 nova-compute[107505]: DEBUG nova.objects.instance [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lazy-loading 'migration_context' on Instance uuid 231af25a-406a-40ff-bc7d-605b55a0cc97 {{(pid=107505) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} Aug 30 14:25:51.277239 np0035104604 nova-compute[107505]: DEBUG nova.objects.base [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd 
tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Object Instance<231af25a-406a-40ff-bc7d-605b55a0cc97> lazy-loaded attributes: info_cache,migration_context {{(pid=107505) wrapper /opt/stack/nova/nova/objects/base.py:126}}
Aug 30 14:25:51.348295 np0035104604 nova-compute[107505]: DEBUG nova.storage.rbd_utils [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] removing snapshot(nova-resize) on rbd image(231af25a-406a-40ff-bc7d-605b55a0cc97_disk) {{(pid=107505) remove_snap /opt/stack/nova/nova/storage/rbd_utils.py:489}}
Aug 30 14:25:51.735026 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}}
Aug 30 14:25:52.354571 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
Aug 30 14:25:52.355160 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
Aug 30 14:25:53.682166 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
Aug 30 14:25:53.936850 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}}
Aug 30 14:25:54.538582 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.857s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
Aug 30 14:25:54.545481 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
Aug 30 14:25:54.568807 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
Aug 30 14:25:54.643037 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 2.288s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
Aug 30 14:25:55.069685 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-b6ac6ef2-8f8b-4d0f-821d-f080c9813c50 tempest-ImagesOneServerTestJSON-427126162 tempest-ImagesOneServerTestJSON-427126162-project-member] Acquiring lock "80874308-ea97-470f-b592-2c218f708ac6" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
Aug 30 14:25:55.121706 np0035104604 nova-compute[107505]: INFO nova.scheduler.client.report [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Deleted allocation for migration 379d1aeb-e026-42e2-82b3-0ed1adfe38c8
Aug 30 14:25:55.227379 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-f462ac5e-f2cf-47c9-880d-bbd0fbe81fdd tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "231af25a-406a-40ff-bc7d-605b55a0cc97" "released" by "nova.compute.manager.ComputeManager.confirm_resize..do_confirm_resize" :: held 5.608s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
Aug 30 14:25:56.738246 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}}
Aug 30 14:26:00.371591 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "7c62e098-aae7-47b1-908a-a20ee3adb41c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
Aug 30 14:26:00.372069 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "7c62e098-aae7-47b1-908a-a20ee3adb41c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
Aug 30 14:26:00.390724 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 7c62e098-aae7-47b1-908a-a20ee3adb41c] Starting instance... {{(pid=107505) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
Aug 30 14:26:00.585522 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
Aug 30 14:26:00.585893 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
Aug 30 14:26:00.590357 np0035104604 nova-compute[107505]: DEBUG nova.virt.hardware [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=107505) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}}
Aug 30 14:26:00.590621 np0035104604 nova-compute[107505]: INFO nova.compute.claims [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 7c62e098-aae7-47b1-908a-a20ee3adb41c] Claim successful on node np0035104604
Aug 30 14:26:01.766827 np0035104604 nova-compute[107505]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=107505) __log_wakeup /opt/stack/data/venv/lib/python3.10/site-packages/ovs/poller.py:263}}
Aug 30 14:26:01.787706 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Running cmd (subprocess): ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
Aug 30 14:26:02.465302 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.processutils [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] CMD "ceph df --format=json --id cinder --conf /etc/ceph/ceph.conf" returned: 0 in 0.677s {{(pid=107505) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
Aug 30 14:26:02.470845 np0035104604 nova-compute[107505]: DEBUG nova.compute.provider_tree [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Inventory has not changed in ProviderTree for provider: 600ab55f-530c-4be6-bf02-067d68ce7ee4 {{(pid=107505) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
Aug 30 14:26:02.488559 np0035104604 nova-compute[107505]: DEBUG nova.scheduler.client.report [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Inventory has not changed for provider 600ab55f-530c-4be6-bf02-067d68ce7ee4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7746, 'reserved': 512, 'min_unit': 1, 'max_unit': 7746, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 29, 'reserved': 0, 'min_unit': 1, 'max_unit': 29, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=107505) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
Aug 30 14:26:02.518216 np0035104604 nova-compute[107505]: DEBUG oslo_concurrency.lockutils [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.932s {{(pid=107505) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
Aug 30 14:26:02.519474 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 7c62e098-aae7-47b1-908a-a20ee3adb41c] Start building networks asynchronously for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
Aug 30 14:26:02.587814 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 7c62e098-aae7-47b1-908a-a20ee3adb41c] Allocating IP information in the background. {{(pid=107505) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
Aug 30 14:26:02.588327 np0035104604 nova-compute[107505]: DEBUG nova.network.neutron [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 7c62e098-aae7-47b1-908a-a20ee3adb41c] allocate_for_instance() {{(pid=107505) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
Aug 30 14:26:02.717719 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 7c62e098-aae7-47b1-908a-a20ee3adb41c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Aug 30 14:26:02.738430 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 7c62e098-aae7-47b1-908a-a20ee3adb41c] Start building block device mappings for instance. {{(pid=107505) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
Aug 30 14:26:02.915254 np0035104604 nova-compute[107505]: DEBUG nova.compute.manager [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 7c62e098-aae7-47b1-908a-a20ee3adb41c] Start spawning the instance on the hypervisor. {{(pid=107505) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
Aug 30 14:26:02.916349 np0035104604 nova-compute[107505]: DEBUG nova.virt.libvirt.driver [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 7c62e098-aae7-47b1-908a-a20ee3adb41c] Creating instance directory {{(pid=107505) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}}
Aug 30 14:26:02.916682 np0035104604 nova-compute[107505]: INFO nova.virt.libvirt.driver [None req-8c4e78f2-92d5-4ae6-ad2d-caecde3ee2d6 tempest-MigrationsAdminTest-1109016108 tempest-MigrationsAdminTest-1109016108-project-member] [instance: 7c62e098-aae7-47b1-908a-a20ee3adb41c] Creating image(s)