[ubuntu 1.20-49~icehouse] BadRequest error seen during VM creation, causing it to be stuck in the Scheduling state

Bug #1381028 reported by Ganesha HV
Affects: Juniper Openstack (status tracked in Trunk)

  Series  Status        Importance  Assigned to  Milestone
  R1.1    Fix Released  High        Ganesha HV   -
  Trunk   Fix Released  High        Ganesha HV   -

Bug Description

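# Testbed layout (fab-style role definitions). Going by the log excerpts
# below, host4 and host5 are the compute nodes nodeg16 and nodeg26.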
env.roledefs = {
    'all': [host1, host2, host3, host4, host5],
    'cfgm': [host1],
    'control': [host2, host3],
    'compute': [host4, host5],
    'collector': [host1],
    'openstack': [host1],
    'webui': [host1],
    'database': [host1],
    'build': [host_build],
}

env.hostnames = {
     'all': ['nodea26', 'nodeb12-1', 'nodeb12-2', 'nodeg16', 'nodeg26']
}

During VM bring-up, the following error was seen in neutron/server.log:

2014-10-14 03:09:08.290 ERROR [neutron.notifiers.nova] Failed to notify nova on events: [{'name': 'network-changed', 'server_uuid': u'20d25d11-44d7-46c7-97f4-a7163aaece79'}, {'name': 'network-changed', 'server_uuid': u'20d25d11-44d7-46c7-97f4-a7163aaece79'}, {'name': 'network-changed', 'server_uuid': u'20d25d11-44d7-46c7-97f4-a7163aaece79'}]
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/neutron/notifiers/nova.py", line 187, in send_events
    batched_events)
  File "/usr/lib/python2.7/dist-packages/novaclient/v1_1/contrib/server_external_events.py", line 39, in create
    return_raw=True)
  File "/usr/lib/python2.7/dist-packages/novaclient/base.py", line 152, in _create
    _resp, body = self.api.client.post(url, body=body)
  File "/usr/lib/python2.7/dist-packages/novaclient/client.py", line 286, in post
    return self._cs_request(url, 'POST', **kwargs)
  File "/usr/lib/python2.7/dist-packages/novaclient/client.py", line 260, in _cs_request
    **kwargs)
  File "/usr/lib/python2.7/dist-packages/novaclient/client.py", line 242, in _time_request
    resp, body = self.request(url, method, **kwargs)
  File "/usr/lib/python2.7/dist-packages/novaclient/client.py", line 236, in request
    raise exceptions.from_response(resp, body, url, method)
BadRequest: The server could not comply with the request since it is either malformed or otherwise incorrect. (HTTP 400)
2014-10-14 03:09:10.289 ERROR [neutron.notifiers.nova] Failed to notify nova on events: [{'name': 'network-changed', 'server_uuid': u'df8d47a3-14a8-4081-b9ec-76c3c2d13156'}, {'name': 'network-changed', 'server_uuid': u'df8d47a3-14a8-4081-b9ec-76c3c2d13156'}, {'name': 'network-changed', 'server_uuid': u'df8d47a3-14a8-4081-b9ec-76c3c2d13156'}]
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/neutron/notifiers/nova.py", line 187, in send_events
    batched_events)
  File "/usr/lib/python2.7/dist-packages/novaclient/v1_1/contrib/server_external_events.py", line 39, in create
    return_raw=True)
  File "/usr/lib/python2.7/dist-packages/novaclient/base.py", line 152, in _create
    _resp, body = self.api.client.post(url, body=body)
  File "/usr/lib/python2.7/dist-packages/novaclient/client.py", line 286, in post
    return self._cs_request(url, 'POST', **kwargs)
  File "/usr/lib/python2.7/dist-packages/novaclient/client.py", line 260, in _cs_request
    **kwargs)
  File "/usr/lib/python2.7/dist-packages/novaclient/client.py", line 242, in _time_request
    resp, body = self.request(url, method, **kwargs)
  File "/usr/lib/python2.7/dist-packages/novaclient/client.py", line 236, in request
    raise exceptions.from_response(resp, body, url, method)
BadRequest: The server could not comply with the request since it is either malformed or otherwise incorrect. (HTTP 400)
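
The traceback hides the body of nova-api's HTTP 400 response. A minimal way to surface it is to replay the failing call by hand, loading the same server_external_events novaclient extension that neutron's notifier uses. This is only a sketch assuming icehouse-era python-novaclient; the credentials, tenant and auth URL are placeholders for this setup:

# Replay the os-server-external-events POST that neutron batches in
# notifiers/nova.py, and print nova-api's rejection message.
from novaclient import extension
from novaclient.v1_1 import client as nova_client
from novaclient.v1_1.contrib import server_external_events

ext = extension.Extension('server_external_events', server_external_events)
# Placeholder credentials/endpoint -- substitute this setup's values.
nova = nova_client.Client('admin', 'password', 'admin',
                          'http://127.0.0.1:5000/v2.0/',
                          extensions=[ext], service_type='compute')

# Same event shape seen in the failed notification above.
events = [{'name': 'network-changed',
           'server_uuid': '20d25d11-44d7-46c7-97f4-a7163aaece79'}]
try:
    print(nova.server_external_events.create(events))
except Exception as exc:
    # BadRequest carries the server-side reason for the HTTP 400.
    print('nova-api rejected the events: %s' % exc)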

The following messages are seen in nova-compute.log on the compute nodes (nodeg16 & nodeg26):

2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit Traceback (most recent call last):
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib/python2.7/dist-packages/oslo/messaging/_drivers/impl_rabbit.py", line 622, in ensure
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit return method(*args, **kwargs)
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib/python2.7/dist-packages/oslo/messaging/_drivers/impl_rabbit.py", line 718, in _publish
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit publisher = cls(self.conf, self.channel, topic, **kwargs)
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib/python2.7/dist-packages/oslo/messaging/_drivers/impl_rabbit.py", line 379, in __init__
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit **options)
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib/python2.7/dist-packages/oslo/messaging/_drivers/impl_rabbit.py", line 326, in __init__
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit self.reconnect(channel)
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib/python2.7/dist-packages/oslo/messaging/_drivers/impl_rabbit.py", line 334, in reconnect
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit routing_key=self.routing_key)
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib/python2.7/dist-packages/kombu/messaging.py", line 82, in __init__
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit self.revive(self._channel)
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib/python2.7/dist-packages/kombu/messaging.py", line 216, in revive
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit self.declare()
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib/python2.7/dist-packages/kombu/messaging.py", line 102, in declare
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit self.exchange.declare()
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib/python2.7/dist-packages/kombu/entity.py", line 166, in declare
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit nowait=nowait, passive=passive,
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib/python2.7/dist-packages/amqp/channel.py", line 612, in exchange_declare
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit (40, 11), # Channel.exchange_declare_ok
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib/python2.7/dist-packages/amqp/abstract_channel.py", line 73, in wait
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit self.channel_id, allowed_methods)
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib/python2.7/dist-packages/amqp/connection.py", line 220, in _wait_method
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit self.method_reader.read_method()
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib/python2.7/dist-packages/amqp/method_framing.py", line 195, in read_method
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit raise m
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit IOError: Socket closed
2014-10-14 03:15:50.641 1745 TRACE oslo.messaging._drivers.impl_rabbit
2014-10-14 03:15:50.643 1745 INFO oslo.messaging._drivers.impl_rabbit [-] Reconnecting to AMQP server on 22.22.22.22:5672
2014-10-14 03:15:50.643 1745 INFO oslo.messaging._drivers.impl_rabbit [-] Delaying reconnect for 1.0 seconds...
2014-10-14 03:15:51.655 1745 INFO oslo.messaging._drivers.impl_rabbit [-] Connected to AMQP server on 22.22.22.22:5672
2014-10-14 03:16:50.233 1745 ERROR nova.openstack.common.periodic_task [-] Error during ComputeManager.update_available_resource: Timed out waiting for a reply to message ID 4fbeb7c2f4a74c19979955151e2783f0
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task Traceback (most recent call last):
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task File "/usr/lib/python2.7/dist-packages/nova/openstack/common/periodic_task.py", line 182, in run_periodic_tasks
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task task(self, context)
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 5405, in update_available_resource
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task rt.update_available_resource(context)
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task File "/usr/lib/python2.7/dist-packages/nova/openstack/common/lockutils.py", line 249, in inner
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task return f(*args, **kwargs)
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task File "/usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py", line 315, in update_available_resource
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task context, self.host, self.nodename)
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task File "/usr/lib/python2.7/dist-packages/nova/objects/base.py", line 110, in wrapper
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task args, kwargs)
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task File "/usr/lib/python2.7/dist-packages/nova/conductor/rpcapi.py", line 425, in object_class_action
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task objver=objver, args=args, kwargs=kwargs)
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task File "/usr/lib/python2.7/dist-packages/oslo/messaging/rpc/client.py", line 150, in call
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task wait_for_reply=True, timeout=timeout)
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task File "/usr/lib/python2.7/dist-packages/oslo/messaging/transport.py", line 90, in _send
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task timeout=timeout)
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task File "/usr/lib/python2.7/dist-packages/oslo/messaging/_drivers/amqpdriver.py", line 412, in send
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout)
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task File "/usr/lib/python2.7/dist-packages/oslo/messaging/_drivers/amqpdriver.py", line 403, in _send
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task result = self._waiter.wait(msg_id, timeout)
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task File "/usr/lib/python2.7/dist-packages/oslo/messaging/_drivers/amqpdriver.py", line 267, in wait
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task reply, ending = self._poll_connection(msg_id, timeout)
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task File "/usr/lib/python2.7/dist-packages/oslo/messaging/_drivers/amqpdriver.py", line 217, in _poll_connection
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task % msg_id)
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task MessagingTimeout: Timed out waiting for a reply to message ID 4fbeb7c2f4a74c19979955151e2783f0
2014-10-14 03:16:50.233 1745 TRACE nova.openstack.common.periodic_task
2014-10-14 03:17:50.238 1745 ERROR nova.servicegroup.drivers.db [-] model server went away
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db Traceback (most recent call last):
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db File "/usr/lib/python2.7/dist-packages/nova/servicegroup/drivers/db.py", line 95, in _report_state
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db service.service_ref, state_catalog)
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db File "/usr/lib/python2.7/dist-packages/nova/conductor/api.py", line 218, in service_update
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db return self._manager.service_update(context, service, values)
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db File "/usr/lib/python2.7/dist-packages/nova/conductor/rpcapi.py", line 330, in service_update
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db service=service_p, values=values)
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db File "/usr/lib/python2.7/dist-packages/oslo/messaging/rpc/client.py", line 150, in call
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db wait_for_reply=True, timeout=timeout)
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db File "/usr/lib/python2.7/dist-packages/oslo/messaging/transport.py", line 90, in _send
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db timeout=timeout)
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db File "/usr/lib/python2.7/dist-packages/oslo/messaging/_drivers/amqpdriver.py", line 412, in send
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db return self._send(target, ctxt, message, wait_for_reply, timeout)
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db File "/usr/lib/python2.7/dist-packages/oslo/messaging/_drivers/amqpdriver.py", line 403, in _send
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db result = self._waiter.wait(msg_id, timeout)
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db File "/usr/lib/python2.7/dist-packages/oslo/messaging/_drivers/amqpdriver.py", line 267, in wait
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db reply, ending = self._poll_connection(msg_id, timeout)
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db File "/usr/lib/python2.7/dist-packages/oslo/messaging/_drivers/amqpdriver.py", line 217, in _poll_connection
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db % msg_id)
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db MessagingTimeout: Timed out waiting for a reply to message ID afceeea4895945dca0af15771c5fd4c7
2014-10-14 03:17:50.238 1745 TRACE nova.servicegroup.drivers.db

Setup is intact.
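
Before the messaging timeouts, the compute log shows the kombu/amqp connection to 22.22.22.22:5672 being dropped ("IOError: Socket closed") and re-established. A quick probe from the compute node, sketched below, can help separate a broker or network problem from a nova-side one; guest/guest is the rabbitmq default and is an assumption here:

# Probe the rabbitmq endpoint seen in the compute log (22.22.22.22:5672).
# kombu is already a nova dependency on the compute node.
import socket
from kombu import Connection

try:
    # guest/guest is an assumed default; adjust to this setup's credentials.
    conn = Connection('amqp://guest:guest@22.22.22.22:5672//',
                      connect_timeout=5)
    conn.connect()
    print('AMQP connection established: %s' % conn.connected)
    conn.release()
except (IOError, socket.error) as exc:
    print('AMQP connection failed: %s' % exc)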

Ganesha HV (ganeshahv) wrote:

The logs and cores are attached at http://mayamruga.englab.juniper.net/bugs/<bug-ID>

information type: Proprietary → Public
tags: added: releasenote
Atul Moghe (moghea) wrote:

I could successfully launch VMs with 14.04/icehouse and 12.04/icehouse images on mainline and R2.0. Can you verify that?
We recently upgraded the juniper/nova icehouse branch with a sync from upstream, which included fixes related to VMs and snapshots.
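
For reference, a re-test along these lines only needs the instance to leave BUILD instead of hanging in the scheduling phase; this is a sketch, and the credentials, image, flavor and network IDs are placeholders:

# Boot one VM and poll until it leaves BUILD; ending in ACTIVE (rather
# than staying stuck in scheduling) verifies the fix.
import time
from novaclient.v1_1 import client as nova_client

nova = nova_client.Client('admin', 'password', 'admin',
                          'http://127.0.0.1:5000/v2.0/')
server = nova.servers.create('bug-1381028-check',
                             '<image-id>', '<flavor-id>',
                             nics=[{'net-id': '<net-id>'}])
while server.status == 'BUILD':
    time.sleep(5)
    server = nova.servers.get(server.id)
print('final status: %s' % server.status)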

Ganesha HV (ganeshahv) wrote:

Not seeing the issue anymore. Closing the bug.
