Mismatch in dhcp_agents_per_network when the overcloud is deployed on Ocata vs. Ocata upgraded from Newton

Bug #1752826 reported by David Manchado
Affects: tripleo
Status: Won't Fix
Importance: Medium
Assigned to: Unassigned

Bug Description

Description
===========
We have noticed that in our Production environment (3 controllers) dhcp_agents_per_network is set to 1, while in our Staging environment (also 3 controllers) it is set to 3.
The only difference is that Production was upgraded from Newton, while Staging was deployed directly on Ocata.
Note that we do not have any custom configuration setting this option, so we would expect the two environments to be in the same state.

According to [1], dhcp_agents_per_network should be set to the number of DHCP agents.
Note that this patch was merged (Jul 2017) after the Production environment was deployed (Mar 2017), but the code from that review is already applied on the system.

I'm filing this LP as the result of an internal discussion to determine whether this is the expected and/or desired behaviour.

[1] https://review.openstack.org/#/c/479970

Steps to reproduce
==================
1 - Deploy RDO Newton on a 3-controller environment (deploying Newton today will probably end up with dhcp_agents_per_network = 3, because the patch is already included)
2 - Upgrade to Ocata
3 - Deploy a second cloud directly on RDO Ocata
4 - Compare the dhcp_agents_per_network value in /etc/neutron/neutron.conf on the two environments (see the sketch below)
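
The comparison in step 4 can be scripted; a minimal sketch (the hiera key and config path are the ones quoted elsewhere in this report):

    # On a controller of each cloud: the hieradata key puppet-tripleo consumes
    # (nil means puppet-tripleo computes the value from the number of agents)
    hiera tripleo::profile::base::neutron::dhcp_agents_per_network
    # On a controller of each cloud: the value neutron-server actually runs with
    grep '^dhcp_agents_per_network' /etc/neutron/neutron.conf
    # From the undercloud, with overcloudrc sourced: the number of DHCP agents
    neutron agent-list | grep -c 'DHCP agent'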

Expected result
===============
dhcp_agents_per_network = 3 on the two environments

Actual result
=============
On an upgraded environment:
dhcp_agents_per_network = 1

On a freshly deployed environment:
dhcp_agents_per_network = 3
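
If consistent behaviour matters more than the default, one workaround is to pin the value explicitly via tripleo-heat-templates. A minimal sketch, assuming the NeutronDhcpAgentsPerNetwork parameter is still honoured (the file name is arbitrary):

    # On the undercloud: environment file pinning the value on both clouds
    cat > ~/dhcp-agents.yaml <<'EOF'
    parameter_defaults:
      NeutronDhcpAgentsPerNetwork: 3
    EOF
    # Re-run the deploy with the usual arguments plus this file
    openstack overcloud deploy --templates -e ~/dhcp-agents.yaml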

Environment
===========
1. Exact version of OpenStack you are running:
Ocata. Relevant tripleo and neutron package versions:

openstack-tripleo-heat-templates-6.2.8-0.20180103204446.4c002bd.el7.centos.noarch
openstack-tripleo-common-6.1.4-1.el7.noarch
openstack-neutron-common-10.0.5-0.20180105192920.295c700.el7.centos.noarch
python2-neutronclient-6.1.1-1.el7.noarch
python-neutron-lib-1.1.0-1.el7.noarch
puppet-tripleo-6.5.7-0.20180105181825.1285b77.el7.centos.noarch
python-neutron-10.0.5-0.20180105192920.295c700.el7.centos.noarch
openstack-neutron-ml2-10.0.5-0.20180105192920.295c700.el7.centos.noarch
openstack-tripleo-ui-3.2.3-0.20180103092401.1680110.el7.centos.noarch
openstack-tripleo-image-elements-6.1.2-1.el7.noarch
openstack-tripleo-0.0.8-0.3.4de13b3git.el7.noarch
openstack-neutron-10.0.5-0.20180105192920.295c700.el7.centos.noarch
puppet-neutron-10.3.2-0.20180103174737.2e7d298.el7.centos.noarch
openstack-tripleo-validations-5.6.3-0.20171223073242.97ae121.el7.centos.noarch
openstack-tripleo-puppet-elements-6.2.4-1.el7.noarch
python-tripleoclient-6.2.3-1.el7.noarch
openstack-neutron-openvswitch-10.0.5-0.20180105192920.295c700.el7.centos.noarch

2. Which storage type did you use?
Not relevant

3. Which networking type did you use?
Neutron with Open vSwitch

Logs & Configs
==============
PROD CONTROLLER
hiera tripleo::profile::base::neutron::dhcp_agents_per_network returns 1

STAGING CONTROLLER
hiera tripleo::profile::base::neutron::dhcp_agents_per_network returns nil
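
To find which hieradata file the stale value on the upgraded controller comes from, a grep over the node's hieradata is a quick check (a sketch; /etc/puppet/hieradata is the Ocata-era datadir and may differ on other releases):

    # On the Production (upgraded) controller
    grep -rn dhcp_agents_per_network /etc/puppet/hieradata/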

Changed in tripleo:
milestone: none → rocky-1
importance: Undecided → Medium
status: New → Triaged
Changed in tripleo:
milestone: rocky-1 → rocky-2
Changed in tripleo:
milestone: rocky-2 → rocky-3
Changed in tripleo:
milestone: rocky-3 → rocky-rc1
Changed in tripleo:
milestone: rocky-rc1 → stein-1
Changed in tripleo:
milestone: stein-1 → stein-2
Changed in tripleo:
milestone: stein-2 → stein-3
Changed in tripleo:
milestone: stein-3 → train-1
Changed in tripleo:
milestone: train-1 → train-2
Revision history for this message
Alex Schultz (alex-schultz) wrote:

Ocata is EOL, though this was likely due to this THT change -> https://review.opendev.org/#/c/442024/1/puppet/services/neutron-base.yaml
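
To confirm how the installed templates wire this parameter, inspecting neutron-base.yaml directly is enough (a sketch; the path assumes the stock THT package location):

    # On the undercloud; matches both NeutronDhcpAgentsPerNetwork and the hiera key
    grep -niE 'dhcp_?agents' /usr/share/openstack-tripleo-heat-templates/puppet/services/neutron-base.yaml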

Changed in tripleo:
status: Triaged → Won't Fix