Ceph HA fails on floating IP creation

Bug #1317538 reported by Vladimir Kuklin
Affects: Fuel for OpenStack
Status: Invalid
Importance: High
Assigned to: Aleksandr Didenko

Bug Description

{"build_id": "2014-05-08_01-10-31", "mirantis": "yes", "build_number": "188", "ostf_sha": "fe718434f88f2ab167779770828a195f06eb29f8", "nailgun_sha": "82b9d42a7a5e9aa1caf6b2779c45ca045cad0ad2", "production": "docker", "api": "1.0", "fuelmain_sha": "97d7f6d5461db3afc27f58160cf9f6985230d255", "astute_sha": "9c83d3ecec69df03cd94620e2df92249ba4ec786", "release": "5.0", "fuellib_sha": "fd31d9a8f85136347b60377df00df6728eda14ca"}

Deploy Ceph HA with nova-network.

The first controller deploy fails here:

2014-05-08T10:43:04.254379+00:00 info: (/Stage[main]/Openstack::Ha::Nova/Openstack::Ha::Haproxy_service[nova-metadata-api]/Exec[haproxy reload for nova-metadata-api]) Starting to evaluate the resource
2014-05-08T10:43:04.258713+00:00 debug: (/Stage[main]/Openstack::Ha::Nova/Openstack::Ha::Haproxy_service[nova-metadata-api]/Exec[haproxy reload for nova-metadata-api]/returns) Exec try 1/10
2014-05-08T10:43:04.258942+00:00 debug: (Exec[haproxy reload for nova-metadata-api](provider=shell)) Executing '/bin/sh -c export OCF_ROOT="/usr/lib/ocf"; (ip netns list | grep haproxy) && ip netns exec haproxy /usr/lib/ocf/resource.d/mirantis/ns_haproxy reload'
2014-05-08T10:43:04.260852+00:00 debug: Executing '/bin/sh -c export OCF_ROOT="/usr/lib/ocf"; (ip netns list | grep haproxy) && ip netns exec haproxy /usr/lib/ocf/resource.d/mirantis/ns_haproxy reload'
2014-05-08T10:43:04.323856+00:00 notice: (/Stage[main]/Openstack::Ha::Nova/Openstack::Ha::Haproxy_service[nova-metadata-api]/Exec[haproxy reload for nova-metadata-api]/returns) haproxy
2014-05-08T10:43:04.324772+00:00 notice: (/Stage[main]/Openstack::Ha::Nova/Openstack::Ha::Haproxy_service[nova-metadata-api]/Exec[haproxy reload for nova-metadata-api]/returns) 2014/05/08_10:43:04 INFO: haproxy daemon running
2014-05-08T10:43:04.325268+00:00 notice: (/Stage[main]/Openstack::Ha::Nova/Openstack::Ha::Haproxy_service[nova-metadata-api]/Exec[haproxy reload for nova-metadata-api]/returns) 2014/05/08_10:43:04 INFO: [WARNING] 127/104304 (2613) : config : 'stats' statement ignored for proxy 'rabbitmq' as it requires HTTP mode. [WARNING] 127/104304 (2613) : config : 'stats' statement ignored for proxy 'mysqld' as it requires HTTP mode.
2014-05-08T10:43:04.325855+00:00 notice: (/Stage[main]/Openstack::Ha::Nova/Openstack::Ha::Haproxy_service[nova-metadata-api]/Exec[haproxy reload for nova-metadata-api]/returns) executed successfully
2014-05-08T10:43:04.326384+00:00 debug: (/Stage[main]/Openstack::Ha::Nova/Openstack::Ha::Haproxy_service[nova-metadata-api]/Exec[haproxy reload for nova-metadata-api]) The container Openstack::Ha::Haproxy_service[nova-metadata-api] will propagate my refresh event
2014-05-08T10:43:04.326997+00:00 info: (/Stage[main]/Openstack::Ha::Nova/Openstack::Ha::Haproxy_service[nova-metadata-api]/Exec[haproxy reload for nova-metadata-api]) Evaluated in 0.07 seconds
2014-05-08T10:43:04.327531+00:00 info: (Openstack::Ha::Haproxy_service[nova-metadata-api]) Starting to evaluate the resource
2014-05-08T10:43:04.335667+00:00 debug: (Openstack::Ha::Haproxy_service[nova-metadata-api]) The container Class[Openstack::Ha::Nova] will propagate my refresh event
2014-05-08T10:43:04.335925+00:00 info: (Openstack::Ha::Haproxy_service[nova-metadata-api]) Evaluated in 0.01 seconds
2014-05-08T10:43:04.336757+00:00 info: (Class[Openstack::Ha::Nova]) Starting to evaluate the resource
2014-05-08T10:43:04.343632+00:00 debug: (Class[Openstack::Ha::Nova]) The container Stage[main] will propagate my refresh event
2014-05-08T10:43:04.343892+00:00 info: (Class[Openstack::Ha::Nova]) Evaluated in 0.01 seconds
2014-05-08T10:43:04.344734+00:00 info: (/Stage[main]/Osnailyfacter::Cluster_ha/Nova_floating_range[10.108.1.128-10.108.1.254]) Starting to evaluate the resource
2014-05-08T10:44:10.765495+00:00 err: (/Stage[main]/Osnailyfacter::Cluster_ha/Nova_floating_range[10.108.1.128-10.108.1.254]) Could not evaluate: Oops - not sure what happened: 757: unexpected token at '<html><body><h1>504 Gateway Time-out</h1>
2014-05-08T10:44:10.765632+00:00 err: (/Stage[main]/Osnailyfacter::Cluster_ha/Nova_floating_range[10.108.1.128-10.108.1.254]) The server didn't respond in time.
2014-05-08T10:44:10.766783+00:00 err: (/Stage[main]/Osnailyfacter::Cluster_ha/Nova_floating_range[10.108.1.128-10.108.1.254]) </body></html>
2014-05-08T10:44:10.766783+00:00 err: (/Stage[main]/Osnailyfacter::Cluster_ha/Nova_floating_range[10.108.1.128-10.108.1.254]) '
2014-05-08T10:44:10.766783+00:00 err: (/Stage[main]/Osnailyfacter::Cluster_ha/Nova_floating_range[10.108.1.128-10.108.1.254]) /usr/lib/ruby/gems/1.8/gems/openstack-1.1.2/lib/openstack/connection.rb:533:in `deal_with_faulty_error'
2014-05-08T10:44:10.766783+00:00 err: (/Stage[main]/Osnailyfacter::Cluster_ha/Nova_floating_range[10.108.1.128-10.108.1.254]) /usr/lib/ruby/gems/1.8/gems/openstack-1.1.2/lib/openstack/connection.rb:510:in `raise_exception'
2014-05-08T10:44:10.766783+00:00 err: (/Stage[main]/Osnailyfacter::Cluster_ha/Nova_floating_range[10.108.1.128-10.108.1.254]) /usr/lib/ruby/gems/1.8/gems/openstack-1.1.2/lib/openstack/connection.rb:207:in `req'
2014-05-08T10:44:10.766783+00:00 err: (/Stage[main]/Osnailyfacter::Cluster_ha/Nova_floating_range[10.108.1.128-10.108.1.254]) /usr/lib/ruby/gems/1.8/gems/openstack-1.1.2/lib/openstack/compute/connection.rb:242:in `api_extensions'
2014-05-08T10:44:10.766783+00:00 err: (/Stage[main]/Osnailyfacter::Cluster_ha/Nova_floating_range[10.108.1.128-10.108.1.254]) /usr/lib/ruby/gems/1.8/gems/openstack-1.1.2/lib/openstack/compute/connection.rb:502:in `check_extension'
2014-05-08T10:44:10.766783+00:00 err: (/Stage[main]/Osnailyfacter::Cluster_ha/Nova_floating_range[10.108.1.128-10.108.1.254]) /usr/lib/ruby/gems/1.8/gems/openstack-1.1.2/l
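As a side note, the gap between the floating-range resource starting its evaluation (10:43:04) and the 504 error (10:44:10) can be computed directly from the log timestamps above; a minimal Python sketch (reading the ~66 s gap as "a bit over a typical 60 s gateway timeout" is an interpretation, not something the report confirms):

```python
from datetime import datetime

# Timestamps copied from the puppet-apply.log excerpt above:
# Nova_floating_range starts evaluating, then fails with a 504.
start = datetime.fromisoformat("2014-05-08T10:43:04.344734+00:00")
error = datetime.fromisoformat("2014-05-08T10:44:10.765495+00:00")

gap = (error - start).total_seconds()
print(f"{gap:.1f} s")  # prints "66.4 s"
```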

Revision history for this message
Vladimir Kuklin (vkuklin) wrote :

Puppet rerun succeeds. Looks like an environment performance bug.

Mike Scherbakov (mihgen)
Changed in fuel:
assignee: Fuel Library Team (fuel-library) → Fuel Python Team (fuel-python)
Changed in fuel:
assignee: Fuel Python Team (fuel-python) → Alexander Didenko (adidenko)
Changed in fuel:
status: New → In Progress
Revision history for this message
Aleksandr Didenko (adidenko) wrote :

> Puppet rerun succeeds. Looks like an environment performance bug.

Was not able to reproduce it yet:

puppet-apply.log:
2014-05-08T10:44:10.765495+00:00 err: (/Stage[main]/Osnailyfacter::Cluster_ha/Nova_floating_range[10.108.1.128-10.108.1.254]) Could not evaluate: Oops - not sure what happened: 757: unexpected token at '<html><body><h1>504 Gateway Time-out</h1>

nova-api.log:
2014-05-08 10:44:10.836 2079 DEBUG urllib3.connectionpool [-] "POST /v2.0/tokens HTTP/1.1" 504 None _make_request /usr/lib/python2.6/site-packages/urllib3/connectionpool.py:330

Around 10:44:10 only one controller node was deploying, which means we got the "504 Gateway Time-out" from haproxy+keystone on node-1. Keystone was already running on this node:

2014-05-08T10:40:31.977063+00:00 debug: 2014-05-08 08:18:31.919 29269 INFO eventlet.wsgi.server [-] (29269) wsgi starting up on http://10.108.2.3:5000/

And there are no errors in the keystone log around 10:44:10. So it could indeed be an environment performance issue: haproxy did not manage to get a response from the keystone backend in time.
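For context, a 504 from haproxy is what it returns when a backend accepts the connection but fails to answer within `timeout server`. A hypothetical excerpt of the relevant haproxy.cfg section (the timeout values, listen-block name, and VIP placeholder are illustrative assumptions, not taken from this environment):

```
# hypothetical /etc/haproxy/haproxy.cfg excerpt -- values are illustrative
defaults
    timeout connect 5s
    timeout server  60s   # backend must answer within this window, else haproxy returns 504

listen keystone-1
    bind <public VIP>:5000
    server node-1 10.108.2.3:5000 check   # keystone backend seen in the log above
```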

Changed in fuel:
status: In Progress → Invalid