[ubuntu-icehouse-mainline-2301] instance launch failure with domain type kvm

Bug #1358128 reported by Prashant Shetty
This bug affects 1 person
Affects            Status         Importance  Assigned to  Milestone
Juniper Openstack  Fix Committed  High        Atul Moghe   -
R1.1               Fix Committed  High        Atul Moghe   -
R2.0               Fix Committed  Undecided   Atul Moghe   -

Bug Description

Instance launch is failing with the trace below on mainline #2301. Can someone check the logs below?

All logs/cores will be at /cs-shared/test_runs/nodeb9/2014_08_17_18_39_56 on nodeb10.englab.juniper.net

Logs:

2014-08-17 06:25:17.491 2307 INFO nova.virt.libvirt.driver [req-789eb6f4-9d4d-49c9-8cad-472e4c9f8349 2b7710368f564e1597dae5d874067448 aa71108b7fd2464fb7d067e38623d0a7] [instance:
 05992afe-acb0-424c-a201-12bb4a9da981] Creating image
2014-08-17 06:25:18.544 2307 ERROR nova.virt.libvirt.driver [req-789eb6f4-9d4d-49c9-8cad-472e4c9f8349 2b7710368f564e1597dae5d874067448 aa71108b7fd2464fb7d067e38623d0a7] An error
occurred while trying to define a domain with xml: <domain type="kvm">
  <uuid>05992afe-acb0-424c-a201-12bb4a9da981</uuid>
  <name>instance-00000005</name>
  <memory>1048576</memory>
  <vcpu>1</vcpu>
  <sysinfo type="smbios">
    <system>
      <entry name="manufacturer">OpenStack Foundation</entry>
      <entry name="product">OpenStack Nova</entry>
      <entry name="version">2014.1</entry>
      <entry name="serial">00000000-0000-0000-0000-00259093d252</entry>
      <entry name="uuid">05992afe-acb0-424c-a201-12bb4a9da981</entry>
    </system>
  </sysinfo>
  <os>
    <type>hvm</type>
    <boot dev="hd"/>
    <smbios mode="sysinfo"/>
  </os>
  <features>
    <acpi/>
    <apic/>
  </features>
  <clock offset="utc">
    <timer name="pit" tickpolicy="delay"/>
    <timer name="rtc" tickpolicy="catchup"/>
    <timer name="hpet" present="no"/>
  </clock>
  <devices>
    <disk type="file" device="disk">
      <driver name="qemu" type="qcow2" cache="none"/>
      <source file="/var/lib/nova/instances/05992afe-acb0-424c-a201-12bb4a9da981/disk"/>
      <target bus="virtio" dev="vda"/>
    </disk>
2014-08-17 06:25:18.545 2307 ERROR nova.compute.manager [req-789eb6f4-9d4d-49c9-8cad-472e4c9f8349 2b7710368f564e1597dae5d874067448 aa71108b7fd2464fb7d067e38623d0a7] [instance: 05992afe-acb0-424c-a201-12bb4a9da981] Instance failed to spawn
2014-08-17 06:25:18.545 2307 TRACE nova.compute.manager [instance: 05992afe-acb0-424c-a201-12bb4a9da981] Traceback (most recent call last):
2014-08-17 06:25:18.545 2307 TRACE nova.compute.manager [instance: 05992afe-acb0-424c-a201-12bb4a9da981] File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 1720, in _spawn
2014-08-17 06:25:18.545 2307 TRACE nova.compute.manager [instance: 05992afe-acb0-424c-a201-12bb4a9da981] block_device_info)
2014-08-17 06:25:18.545 2307 TRACE nova.compute.manager [instance: 05992afe-acb0-424c-a201-12bb4a9da981] File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py", line 2253, in spawn
2014-08-17 06:25:18.545 2307 TRACE nova.compute.manager [instance: 05992afe-acb0-424c-a201-12bb4a9da981] block_device_info)
2014-08-17 06:25:18.545 2307 TRACE nova.compute.manager [instance: 05992afe-acb0-424c-a201-12bb4a9da981] File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py", line 3644, in _create_domain_and_network
2014-08-17 06:25:18.545 2307 TRACE nova.compute.manager [instance: 05992afe-acb0-424c-a201-12bb4a9da981] power_on=power_on)
2014-08-17 06:25:18.545 2307 TRACE nova.compute.manager [instance: 05992afe-acb0-424c-a201-12bb4a9da981] File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py", line 3538, in _create_domain
2014-08-17 06:25:18.545 2307 TRACE nova.compute.manager [instance: 05992afe-acb0-424c-a201-12bb4a9da981] raise e
2014-08-17 06:25:18.545 2307 TRACE nova.compute.manager [instance: 05992afe-acb0-424c-a201-12bb4a9da981] libvirtError: internal error: no supported architecture for os type 'hvm'
2014-08-17 06:25:18.545 2307 TRACE nova.compute.manager [instance: 05992afe-acb0-424c-a201-12bb4a9da981]
2014-08-17 06:25:18.659 2307 AUDIT nova.compute.manager [req-789eb6f4-9d4d-49c9-8cad-472e4c9f8349 2b7710368f564e1597dae5d874067448 aa71108b7fd2464fb7d067e38623d0a7] [instance: 05992afe-acb0-424c-a201-12bb4a9da981] Terminating instance
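The failure "libvirtError: internal error: no supported architecture for os type 'hvm'" generally means libvirt cannot offer a KVM-capable guest on the host: the kvm kernel modules are not loaded, /dev/kvm is missing (virtualization disabled in BIOS, or a VM without nested virt), or the qemu-kvm package is absent. A diagnostic sketch for the failing compute node; these commands are standard libvirt/Ubuntu tooling and are an assumption on my part, not taken from the attached logs:

```shell
# Diagnostic sketch for the "no supported architecture for os type 'hvm'"
# error. Assumes a standard Ubuntu compute node with libvirt installed.

# 1. Are the KVM kernel modules loaded?
lsmod 2>/dev/null | grep -q kvm || echo "kvm modules not loaded"

# 2. Does /dev/kvm exist? (Absent when VT-x/AMD-V is off or qemu-kvm is missing.)
[ -e /dev/kvm ] || echo "/dev/kvm missing"

# 3. Does libvirt itself advertise a kvm guest type for hvm?
virsh capabilities 2>/dev/null | grep -q "domain type='kvm'" \
    || echo "libvirt reports no kvm domain (consistent with the 'hvm' error)"
```

If step 3 reports no kvm domain while nova.conf requests domain type kvm, nova-compute will fail exactly as in the trace above when it tries to define the domain.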

Revision history for this message
shajuvk (shajuvk) wrote :

More logs from multinode icehouse build-22, including installation logs, have been copied to the location below:
/cs-shared/shaju/bugs/vm-err-ichouse-logs
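For nodes that genuinely lack hardware virtualization, a common interim workaround (this is an assumption about a generic mitigation, not the fix committed for this bug) is to switch nova-compute to plain QEMU emulation. In Icehouse (2014.1) that is the virt_type option in the [libvirt] section of nova.conf:

```ini
[libvirt]
# Workaround only: software emulation instead of KVM.
# Restore virt_type = kvm once /dev/kvm is available, as QEMU
# emulation is far slower.
virt_type = qemu
```

nova-compute must be restarted for the change to take effect.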

tags: added: blocker
information type: Proprietary → Public
Atul Moghe (moghea)
Changed in juniperopenstack:
status: New → In Progress
Atul Moghe (moghea)
Changed in juniperopenstack:
status: In Progress → Fix Committed