Octavia Create in 2023.2 Throws Exception during message handling: AttributeError: 'NoneType' object has no attribute 'upper'

Bug #2066254 reported by Aldin Setiawan
This bug affects 1 person
Affects: kolla-ansible
Status: New
Importance: Undecided
Assigned to: Unassigned

Bug Description

**Bug Report**

What happened:
After deploying a new 2023.2 environment with Octavia, I am unable to create a minimal load balancer. Creating one via the Horizon UI returns an "Unexpected Error", and the Octavia worker log shows the following:
2024-05-21 16:05:04.778 731 INFO octavia.controller.queue.v2.endpoints [-] Deleting load balancer '04dabdbb-96cb-44f5-971e-024b3e6f4c7f'...
2024-05-21 16:05:05.152 731 WARNING octavia.network.drivers.neutron.allowed_address_pairs [-] Can't deallocate VIP because the vip port c170f6a6-722d-4386-bef8-a649703c405d cannot be found in neutron. Continuing cleanup.: octavia.network.base.PortNotFound: port not found (port id: c170f6a6-722d-4386-bef8-a649703c405d).
2024-05-21 16:05:05.205 731 INFO octavia.network.drivers.neutron.allowed_address_pairs [-] Removing security group 40218bcc-0d4c-425e-af28-c39528fa0ae8 from port c170f6a6-722d-4386-bef8-a649703c405d
2024-05-21 16:05:05.321 731 INFO octavia.network.drivers.neutron.allowed_address_pairs [-] Deleted security group 40218bcc-0d4c-425e-af28-c39528fa0ae8
2024-05-21 16:24:30.207 7 INFO octavia.common.config [-] Logging enabled!
2024-05-21 16:24:30.208 7 INFO octavia.common.config [-] /var/lib/kolla/venv/bin/octavia-worker version 11.0.4.dev1
2024-05-21 16:24:30.275 1009 INFO octavia.controller.queue.v1.consumer [-] Starting consumer...
2024-05-21 16:24:30.289 1009 WARNING octavia.controller.worker.v1.controller_worker [-] The 'amphorav1' provider is deprecated and will be removed in a future release. Use the 'amphora' driver instead.
2024-05-21 16:24:30.288 1012 INFO octavia.controller.queue.v2.consumer [-] Starting V2 consumer...
2024-05-21 16:26:05.845 1012 INFO octavia.controller.queue.v2.endpoints [-] Creating load balancer '5f023425-899d-4d2e-b8a6-231737392ef0'...
2024-05-21 16:26:06.933 1012 INFO octavia.network.drivers.neutron.allowed_address_pairs [-] Port 6e8c95d9-a3b2-46cb-8ce2-2b9eb73bad56 already exists. Nothing to be done.
2024-05-21 16:26:06.934 1012 INFO octavia.controller.worker.v2.tasks.network_tasks [-] Allocated vip with port id 6e8c95d9-a3b2-46cb-8ce2-2b9eb73bad56, subnet id 9a773df2-c458-424c-8c87-37c5f6918519, ip address 10.10.10.220 for load balancer 5f023425-899d-4d2e-b8a6-231737392ef0
2024-05-21 16:26:06.971 1012 INFO octavia.controller.worker.v2.tasks.database_tasks [-] Updated vip with port id 6e8c95d9-a3b2-46cb-8ce2-2b9eb73bad56, subnet id 9a773df2-c458-424c-8c87-37c5f6918519, ip address 10.10.10.220 for load balancer 5f023425-899d-4d2e-b8a6-231737392ef0
2024-05-21 16:26:07.241 1012 WARNING octavia.controller.worker.v2.controller_worker [-] Task 'octavia.controller.worker.v2.tasks.network_tasks.UpdateVIPSecurityGroup' (0e5de3e4-84e9-4411-8186-e91a7f0aea17) transitioned into state 'FAILURE' from state 'RUNNING'
6 predecessors (most recent first):
  Atom 'octavia.controller.worker.v2.tasks.database_tasks.UpdateAdditionalVIPsAfterAllocation' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'loadbalancer_id': '5f023425-899d-4d2e-b8a6-231737392ef0', 'additional_vips': []}, 'provides': {'admin_state_up': True, 'description': None, 'loadbalancer_id': '5f023425-899d-4d2e-b8a6-231737392ef0', 'name': None, 'project_id': '43369cf8503c42b497dde9e5cf8fdda6', 'vip_address': '10.10.10.220', 'vip_network_id': '06697f3c-1ed5-44c6-9dd6-f7d3c85e50d4', 'vip_port_id': '6e8c95d9-a3b2-46cb-8ce2-2b9eb73bad56', 'vip_subnet_id': '9a773df2-c458-424c-8c87-37c5f6918519', 'vip_qos_policy_id': None, 'availability_zone': None}}
  |__Atom 'octavia.controller.worker.v2.tasks.database_tasks.UpdateVIPAfterAllocation' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'loadbalancer_id': '5f023425-899d-4d2e-b8a6-231737392ef0', 'vip': {'load_balancer_id': '5f023425-899d-4d2e-b8a6-231737392ef0', 'ip_address': '10.10.10.220', 'subnet_id': '9a773df2-c458-424c-8c87-37c5f6918519', 'network_id': '06697f3c-1ed5-44c6-9dd6-f7d3c85e50d4', 'port_id': '6e8c95d9-a3b2-46cb-8ce2-2b9eb73bad56', 'load_balancer': None, 'qos_policy_id': None, 'octavia_owned': False}}, 'provides': {'admin_state_up': True, 'description': None, 'loadbalancer_id': '5f023425-899d-4d2e-b8a6-231737392ef0', 'name': None, 'project_id': '43369cf8503c42b497dde9e5cf8fdda6', 'vip_address': '10.10.10.220', 'vip_network_id': '06697f3c-1ed5-44c6-9dd6-f7d3c85e50d4', 'vip_port_id': '6e8c95d9-a3b2-46cb-8ce2-2b9eb73bad56', 'vip_subnet_id': '9a773df2-c458-424c-8c87-37c5f6918519', 'vip_qos_policy_id': None, 'availability_zone': None}}
     |__Atom 'octavia.controller.worker.v2.tasks.network_tasks.AllocateVIP' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'loadbalancer': {'admin_state_up': True, 'description': None, 'loadbalancer_id': '5f023425-899d-4d2e-b8a6-231737392ef0', 'name': None, 'project_id': '43369cf8503c42b497dde9e5cf8fdda6', 'vip_address': '10.10.10.220', 'vip_network_id': '06697f3c-1ed5-44c6-9dd6-f7d3c85e50d4', 'vip_port_id': '6e8c95d9-a3b2-46cb-8ce2-2b9eb73bad56', 'vip_subnet_id': '9a773df2-c458-424c-8c87-37c5f6918519', 'vip_qos_policy_id': None, 'availability_zone': None}}, 'provides': ({'load_balancer_id': '5f023425-899d-4d2e-b8a6-231737392ef0', 'ip_address': '10.10.10.220', 'subnet_id': '9a773df2-c458-424c-8c87-37c5f6918519', 'network_id': '06697f3c-1ed5-44c6-9dd6-f7d3c85e50d4', 'port_id': '6e8c95d9-a3b2-46cb-8ce2-2b9eb73bad56', 'load_balancer': None, 'qos_policy_id': None, 'octavia_owned': False}, [])}
        |__Atom 'reload-lb-before-allocate-vip' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'loadbalancer_id': '5f023425-899d-4d2e-b8a6-231737392ef0'}, 'provides': {'admin_state_up': True, 'description': None, 'loadbalancer_id': '5f023425-899d-4d2e-b8a6-231737392ef0', 'name': None, 'project_id': '43369cf8503c42b497dde9e5cf8fdda6', 'vip_address': '10.10.10.220', 'vip_network_id': '06697f3c-1ed5-44c6-9dd6-f7d3c85e50d4', 'vip_port_id': '6e8c95d9-a3b2-46cb-8ce2-2b9eb73bad56', 'vip_subnet_id': '9a773df2-c458-424c-8c87-37c5f6918519', 'vip_qos_policy_id': None, 'availability_zone': None}}
           |__Atom 'octavia.controller.worker.v2.tasks.lifecycle_tasks.LoadBalancerIDToErrorOnRevertTask' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'loadbalancer_id': '5f023425-899d-4d2e-b8a6-231737392ef0'}, 'provides': None}
              |__Flow 'octavia-create-loadbalancer-flow': AttributeError: 'NoneType' object has no attribute 'upper'
2024-05-21 16:26:07.241 1012 ERROR octavia.controller.worker.v2.controller_worker Traceback (most recent call last):
2024-05-21 16:26:07.241 1012 ERROR octavia.controller.worker.v2.controller_worker File "/var/lib/kolla/venv/lib/python3.10/site-packages/taskflow/engines/action_engine/executor.py", line 52, in _execute_task
2024-05-21 16:26:07.241 1012 ERROR octavia.controller.worker.v2.controller_worker result = task.execute(**arguments)
2024-05-21 16:26:07.241 1012 ERROR octavia.controller.worker.v2.controller_worker File "/var/lib/kolla/venv/lib/python3.10/site-packages/octavia/controller/worker/v2/tasks/network_tasks.py", line 527, in execute
2024-05-21 16:26:07.241 1012 ERROR octavia.controller.worker.v2.controller_worker sg_id = self.network_driver.update_vip_sg(db_lb, db_lb.vip)
2024-05-21 16:26:07.241 1012 ERROR octavia.controller.worker.v2.controller_worker File "/var/lib/kolla/venv/lib/python3.10/site-packages/octavia/network/drivers/neutron/allowed_address_pairs.py", line 405, in update_vip_sg
2024-05-21 16:26:07.241 1012 ERROR octavia.controller.worker.v2.controller_worker self._update_security_group_rules(load_balancer,
2024-05-21 16:26:07.241 1012 ERROR octavia.controller.worker.v2.controller_worker File "/var/lib/kolla/venv/lib/python3.10/site-packages/octavia/network/drivers/neutron/allowed_address_pairs.py", line 199, in _update_security_group_rules
2024-05-21 16:26:07.241 1012 ERROR octavia.controller.worker.v2.controller_worker rule.get('protocol').upper() not in
2024-05-21 16:26:07.241 1012 ERROR octavia.controller.worker.v2.controller_worker AttributeError: 'NoneType' object has no attribute 'upper'
2024-05-21 16:26:07.241 1012 ERROR octavia.controller.worker.v2.controller_worker
2024-05-21 16:26:07.250 1012 WARNING octavia.controller.worker.v2.controller_worker [-] Task 'octavia.controller.worker.v2.tasks.network_tasks.UpdateVIPSecurityGroup' (0e5de3e4-84e9-4411-8186-e91a7f0aea17) transitioned into state 'REVERTED' from state 'REVERTING'
2024-05-21 16:26:07.252 1012 WARNING octavia.controller.worker.v2.controller_worker [-] Task 'octavia.controller.worker.v2.tasks.database_tasks.UpdateAdditionalVIPsAfterAllocation' (3cb850ab-227a-4eb1-a067-ff52b59c089d) transitioned into state 'REVERTED' from state 'REVERTING'
2024-05-21 16:26:07.255 1012 WARNING octavia.controller.worker.v2.controller_worker [-] Task 'octavia.controller.worker.v2.tasks.database_tasks.UpdateVIPAfterAllocation' (8dadbb17-1b4f-4132-80d6-cf8498176ec0) transitioned into state 'REVERTED' from state 'REVERTING'
2024-05-21 16:26:07.257 1012 WARNING octavia.controller.worker.v2.tasks.network_tasks [-] Deallocating vip 10.10.10.220
2024-05-21 16:26:07.257 1012 ERROR octavia.controller.worker.v2.tasks.network_tasks [-] Failed to deallocate VIP. Resources may still be in use from vip: 10.10.10.220 due to error: 'NoneType' object has no attribute 'amphorae'
2024-05-21 16:26:07.258 1012 WARNING octavia.controller.worker.v2.controller_worker [-] Task 'octavia.controller.worker.v2.tasks.network_tasks.AllocateVIP' (ca201b38-4213-4e23-8c50-77e627a80f22) transitioned into state 'REVERTED' from state 'REVERTING'
2024-05-21 16:26:07.261 1012 WARNING octavia.controller.worker.v2.controller_worker [-] Task 'reload-lb-before-allocate-vip' (783e6951-2253-4ae6-bb9f-daa538c398ad) transitioned into state 'REVERTED' from state 'REVERTING'
2024-05-21 16:26:07.270 1012 WARNING octavia.controller.worker.v2.controller_worker [-] Task 'octavia.controller.worker.v2.tasks.lifecycle_tasks.LoadBalancerIDToErrorOnRevertTask' (e23bd076-6b5d-4007-8f12-883fb5360708) transitioned into state 'REVERTED' from state 'REVERTING'
2024-05-21 16:26:07.275 1012 WARNING octavia.controller.worker.v2.controller_worker [-] Flow 'octavia-create-loadbalancer-flow' (70f3dc8d-0c08-45c8-b11b-5661b95b58f2) transitioned into state 'REVERTED' from state 'RUNNING'
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server [-] Exception during message handling: AttributeError: 'NoneType' object has no attribute 'upper'
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/octavia/controller/queue/v2/endpoints.py", line 43, in create_load_balancer
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server self.worker.create_load_balancer(loadbalancer, flavor,
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/tenacity/__init__.py", line 333, in wrapped_f
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server return self(f, *args, **kw)
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/tenacity/__init__.py", line 423, in __call__
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server do = self.iter(retry_state=retry_state)
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/tenacity/__init__.py", line 360, in iter
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server return fut.result()
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.10/concurrent/futures/_base.py", line 451, in result
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server return self.__get_result()
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server raise self._exception
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/tenacity/__init__.py", line 426, in __call__
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server result = fn(*args, **kwargs)
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/octavia/controller/worker/v2/controller_worker.py", line 364, in create_load_balancer
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server self.run_flow(
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/octavia/controller/worker/v2/controller_worker.py", line 111, in run_flow
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server tf.run()
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/taskflow/engines/action_engine/engine.py", line 247, in run
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server for _state in self.run_iter(timeout=timeout):
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/taskflow/engines/action_engine/engine.py", line 340, in run_iter
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server failure.Failure.reraise_if_any(er_failures)
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/taskflow/types/failure.py", line 338, in reraise_if_any
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server failures[0].reraise()
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/taskflow/types/failure.py", line 350, in reraise
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server raise value
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/taskflow/engines/action_engine/executor.py", line 52, in _execute_task
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server result = task.execute(**arguments)
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/octavia/controller/worker/v2/tasks/network_tasks.py", line 527, in execute
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server sg_id = self.network_driver.update_vip_sg(db_lb, db_lb.vip)
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/octavia/network/drivers/neutron/allowed_address_pairs.py", line 405, in update_vip_sg
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server self._update_security_group_rules(load_balancer,
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.10/site-packages/octavia/network/drivers/neutron/allowed_address_pairs.py", line 199, in _update_security_group_rules
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server rule.get('protocol').upper() not in
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server AttributeError: 'NoneType' object has no attribute 'upper'
2024-05-21 16:26:07.276 1012 ERROR oslo_messaging.rpc.server

I am deploying a minimal load balancer on OpenStack 2023.2. I first hit an SSL bug and applied the workaround from https://bugs.launchpad.net/kolla-ansible/+bug/2046382, and now I am getting this error.
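For context, the traceback points at Octavia's allowed_address_pairs driver comparing existing security group rules against an expected set. Neutron represents an "any protocol" rule with protocol set to None, and calling .upper() on that None produces exactly this AttributeError. A minimal Python sketch of the pattern and a None-safe variant (an illustration, not the exact Octavia source):

# Neutron security group rules may carry protocol=None ("any protocol").
rules = [
    {'id': 'rule-1', 'protocol': 'tcp'},
    {'id': 'rule-2', 'protocol': None},  # an "any protocol" rule
]

for rule in rules:
    # The failing pattern: rule.get('protocol') returns None for rule-2,
    # so .upper() raises AttributeError: 'NoneType' object has no attribute 'upper'.
    # rule.get('protocol').upper()

    # None-safe variant:
    protocol = rule.get('protocol')
    if protocol is not None and protocol.upper() not in ('TCP', 'UDP'):
        print('unexpected protocol on rule', rule['id'])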

How to reproduce it (minimal and precise):

- Deploy the base Kolla-Ansible services with Octavia enabled (globals.yml is below)
- Select the admin project if needed
- Navigate to Project > Network > Load Balancers
- Create a new load balancer
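The same create path can presumably be exercised without Horizon via openstacksdk. A sketch, assuming a configured clouds.yaml entry (the cloud name "mycloud" is a placeholder; the subnet ID is the one from the log above):

import openstack

conn = openstack.connect(cloud='mycloud')  # hypothetical cloud name
# A minimal create should enqueue the same octavia-create-loadbalancer-flow.
lb = conn.load_balancer.create_load_balancer(
    name='test-lb',
    vip_subnet_id='9a773df2-c458-424c-8c87-37c5f6918519',  # subnet from the log
)
print(lb.id, lb.provisioning_status)  # expected to end in ERROR once the flow reverts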

**Environment**:
root@jk1oscmp01:/etc/kolla# cat /etc/os-release
PRETTY_NAME="Ubuntu 22.04.4 LTS"
NAME="Ubuntu"
VERSION_ID="22.04"
VERSION="22.04.4 LTS (Jammy Jellyfish)"
VERSION_CODENAME=jammy
ID=ubuntu
ID_LIKE=debian
HOME_URL="https://www.ubuntu.com/"

---
kolla_base_distro: "ubuntu"
kolla_install_type: "source"
openstack_release: "2023.2"
nova_compute_virt_type: "kvm"
openstack_region_name: "Jkt3"
horizon_keystone_multidomain: true

## tls
kolla_enable_tls_internal: "yes"
kolla_enable_tls_external: "yes"
kolla_copy_ca_into_containers: "yes"
kolla_verify_tls_backend: "no"
libvirt_tls: "no"
libvirt_enable_sasl: "false"
openstack_cacert: "/etc/ssl/certs/ca-certificates.crt"
## openstack service
enable_openstack_core: "yes"
enable_cinder: "yes"
enable_fluentd: "no"
enable_barbican: "yes"
enable_haproxy: "yes"
enable_octavia: "yes"
barbican_crypto_plugin: "simple_crypto"
barbican_library_path: "/usr/lib/libCryptoki2_64.so"
#enable_magnum: "yes"
#enable_redis: "yes"
enable_octavia_jobboard: "no"
octavia_amp_flavor:
  name: "amphora"
  is_public: "no"
  vcpus: 1
  ram: 1024
  disk: 5

octavia_amp_network:
  name: lb-mgmt-net
  external: "yes"
  shared: "no"
  mtu: 1500
  provider_physical_network: "vlan-52"
  subnet:
    name: lb-mgmt-subnet
    cidr: "{{ octavia_amp_network_cidr }}"
    enable_dhcp: "yes"
    gateway_ip: "172.18.55.254"
    allocation_pool_start: "172.18.52.20"
    allocation_pool_end: "172.18.55.253"

# Octavia management network subnet CIDR.
octavia_amp_network_cidr: 172.18.52.0/22
octavia_loadbalancer_topology: "ACTIVE_STANDBY"
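
Speculative diagnostic: if the failing comparison is indeed tripping over an "any protocol" security group rule, a loop like this (openstacksdk; the cloud name is again a placeholder) should show whether any rules with protocol=None exist in the deployment:

import openstack

conn = openstack.connect(cloud='mycloud')  # hypothetical cloud name
# List rules whose protocol is None ("any protocol"); these are the values
# that rule.get('protocol').upper() cannot handle.
for rule in conn.network.security_group_rules():
    if rule.protocol is None:
        print(rule.security_group_id, rule.id, rule.direction, rule.ether_type)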

Tags: octavia