When I create a pool with --session-persistence type=SOURCE_IP, an error occurs in octavia-worker.log. APP_COOKIE has the same problem, but HTTP_COOKIE works fine.
My amphora image is based on CentOS 7.
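For context, SOURCE_IP persistence should make Octavia render a stick table into the backend section of the amphora's haproxy config, roughly like the sketch below (the backend name is the pool ID; the exact stanza depends on the haproxy jinja template in the deployed release):
----
backend acb392b2-6c3d-465b-b302-1af2aae84c95
    mode http
    balance roundrobin
    stick-table type ip size 10k
    stick on src
----
HTTP_COOKIE, by contrast, is implemented as a plain "cookie ... insert" directive with no stick table, which would fit the pattern of both stick-table-based types (SOURCE_IP and APP_COOKIE) failing while HTTP_COOKIE works.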
Steps to reproduce:
1.neutron lbaas-loadbalancer-create --name sp-test subnet
2.neutron lbaas-listener-create --name sp-test --loadbalancer sp-test --protocol HTTP --protocol-port 80
3.neutron lbaas-pool-create --name sp-test --lb-algorithm ROUND_ROBIN --listener sp-test --protocol HTTP --session-persistence type=SOURCE_IP
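After step 3 fails, the reload can be retried by hand inside the amphora to surface the underlying haproxy error (a sketch; the listener ID and unit name are taken from the logs below, and the config path assumes the standard 0.5 amphora layout):
----
# Validate the rendered config, then repeat the reload the agent attempts
haproxy -c -f /var/lib/octavia/8dcf9ec0-ec47-4068-a4fa-b4a117f14c65/haproxy.cfg
systemctl reload haproxy-8dcf9ec0-ec47-4068-a4fa-b4a117f14c65.service
----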
octavia-worker.log
---
2017-04-07 16:45:42.446 9922 INFO octavia.controller.queue.endpoint [-] Creating pool 'acb392b2-6c3d-465b-b302-1af2aae84c95'...
2017-04-07 16:45:49.534 9922 WARNING octavia.controller.worker.controller_worker [-] Task 'octavia.controller.worker.tasks.amphora_driver_tasks.ListenersUpdate' (7ed2654f-6279-40df-8eea-c927e46063bd) transitioned into state 'FAILURE' from state 'RUNNING'
3 predecessors (most recent first):
Atom 'octavia.controller.worker.tasks.database_tasks.MarkPoolPendingCreateInDB' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'pool': <octavia.common.data_models.Pool object at 0x48af8d0>}, 'provides': None}
|__Atom 'octavia.controller.worker.tasks.lifecycle_tasks.PoolToErrorOnRevertTask' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'listeners': [<octavia.common.data_models.Listener object at 0x51ae990>], 'loadbalancer': <octavia.common.data_models.LoadBalancer object at 0x51adad0>, 'pool': <octavia.common.data_models.Pool object at 0x48af8d0>}, 'provides': None}
|__Flow 'octavia-create-pool-flow'
2017-04-07 16:45:49.534 9922 ERROR octavia.controller.worker.controller_worker Traceback (most recent call last):
2017-04-07 16:45:49.534 9922 ERROR octavia.controller.worker.controller_worker File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
2017-04-07 16:45:49.534 9922 ERROR octavia.controller.worker.controller_worker result = task.execute(**arguments)
2017-04-07 16:45:49.534 9922 ERROR octavia.controller.worker.controller_worker File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/octavia/controller/worker/tasks/amphora_driver_tasks.py", line 57, in execute
2017-04-07 16:45:49.534 9922 ERROR octavia.controller.worker.controller_worker self.amphora_driver.update(listener, loadbalancer.vip)
2017-04-07 16:45:49.534 9922 ERROR octavia.controller.worker.controller_worker File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/octavia/amphorae/drivers/haproxy/rest_api_driver.py", line 79, in update
2017-04-07 16:45:49.534 9922 ERROR octavia.controller.worker.controller_worker self.client.reload_listener(amp, listener.id)
2017-04-07 16:45:49.534 9922 ERROR octavia.controller.worker.controller_worker File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/octavia/amphorae/drivers/haproxy/rest_api_driver.py", line 319, in _action
2017-04-07 16:45:49.534 9922 ERROR octavia.controller.worker.controller_worker return exc.check_exception(r)
2017-04-07 16:45:49.534 9922 ERROR octavia.controller.worker.controller_worker File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/octavia/amphorae/drivers/haproxy/exceptions.py", line 32, in check_exception
2017-04-07 16:45:49.534 9922 ERROR octavia.controller.worker.controller_worker raise responses[status_code]()
2017-04-07 16:45:49.534 9922 ERROR octavia.controller.worker.controller_worker InternalServerError: Internal Server Error
2017-04-07 16:45:49.534 9922 ERROR octavia.controller.worker.controller_worker
2017-04-07 16:45:49.539 9922 WARNING octavia.controller.worker.tasks.amphora_driver_tasks [-] Reverting listeners updates.
2017-04-07 16:45:49.551 9922 WARNING octavia.controller.worker.controller_worker [-] Task 'octavia.controller.worker.tasks.amphora_driver_tasks.ListenersUpdate' (7ed2654f-6279-40df-8eea-c927e46063bd) transitioned into state 'REVERTED' from state 'REVERTING'
2017-04-07 16:45:49.554 9922 WARNING octavia.controller.worker.tasks.database_tasks [-] Reverting mark pool pending create in DB for pool id acb392b2-6c3d-465b-b302-1af2aae84c95
2017-04-07 16:45:49.562 9922 WARNING octavia.controller.worker.controller_worker [-] Task 'octavia.controller.worker.tasks.database_tasks.MarkPoolPendingCreateInDB' (2cda4e69-84d2-4cfe-af2e-f60762d82169) transitioned into state 'REVERTED' from state 'REVERTING'
2017-04-07 16:45:49.583 9922 WARNING octavia.controller.worker.controller_worker [-] Task 'octavia.controller.worker.tasks.lifecycle_tasks.PoolToErrorOnRevertTask' (ec879330-3e6b-46ea-99bd-3848a84d71fc) transitioned into state 'REVERTED' from state 'REVERTING'
2017-04-07 16:45:49.585 9922 WARNING octavia.controller.worker.controller_worker [-] Flow 'octavia-create-pool-flow' (72d4b77f-4541-46f8-b729-f7df90b07a97) transitioned into state 'REVERTED' from state 'RUNNING'
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server [-] Exception during message handling
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 157, in _process_incoming
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 213, in dispatch
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 183, in _do_dispatch
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/octavia/controller/queue/endpoint.py", line 71, in create_pool
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server self.worker.create_pool(pool_id)
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/octavia/controller/worker/controller_worker.py", line 423, in create_pool
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server create_pool_tf.run()
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 247, in run
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server for _state in self.run_iter(timeout=timeout):
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 340, in run_iter
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server failure.Failure.reraise_if_any(er_failures)
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/taskflow/types/failure.py", line 336, in reraise_if_any
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server failures[0].reraise()
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/taskflow/types/failure.py", line 343, in reraise
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server six.reraise(*self._exc_info)
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server result = task.execute(**arguments)
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/octavia/controller/worker/tasks/amphora_driver_tasks.py", line 57, in execute
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server self.amphora_driver.update(listener, loadbalancer.vip)
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/octavia/amphorae/drivers/haproxy/rest_api_driver.py", line 79, in update
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server self.client.reload_listener(amp, listener.id)
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/octavia/amphorae/drivers/haproxy/rest_api_driver.py", line 319, in _action
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server return exc.check_exception(r)
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server File "/opt/openstack/octavia/octavia-env/lib/python2.7/site-packages/octavia/amphorae/drivers/haproxy/exceptions.py", line 32, in check_exception
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server raise responses[status_code]()
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server InternalServerError: Internal Server Error
2017-04-07 16:45:49.586 9922 ERROR oslo_messaging.rpc.server
----
amphora-agent.log
----
[2017-04-07 08:45:42 +0000] [23889] [DEBUG] PUT /0.5/listeners/5990279c-52c7-460a-81e2-e65ace1b3dfa/8dcf9ec0-ec47-4068-a4fa-b4a117f14c65/haproxy
::ffff:192.168.30.183 - - [07/Apr/2017:08:45:45 +0000] "PUT /0.5/listeners/5990279c-52c7-460a-81e2-e65ace1b3dfa/8dcf9ec0-ec47-4068-a4fa-b4a117f14c65/haproxy HTTP/1.1" 202 22 "-" "Octavia HaProxy Rest Client/0.5 (https://wiki.openstack.org/wiki/Octavia)"
[2017-04-07 08:45:45 +0000] [23889] [DEBUG] PUT /0.5/listeners/8dcf9ec0-ec47-4068-a4fa-b4a117f14c65/reload
::ffff:192.168.30.183 - - [07/Apr/2017:08:45:48 +0000] "PUT /0.5/listeners/8dcf9ec0-ec47-4068-a4fa-b4a117f14c65/reload HTTP/1.1" 500 388 "-" "Octavia HaProxy Rest Client/0.5 (https://wiki.openstack.org/wiki/Octavia)"
----
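The failing call can also be replayed directly against the amphora agent API, bypassing the worker (illustrative only: the address is the amphora's management IP from the logs, and the cert paths are the controller-side defaults, which may differ in your deployment):
----
curl -X PUT \
  --cacert /etc/octavia/certs/server_ca.pem \
  --cert /etc/octavia/certs/client.pem \
  https://192.168.31.217:9443/0.5/listeners/8dcf9ec0-ec47-4068-a4fa-b4a117f14c65/reload
----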
amphora /var/log/messages
----
Apr 7 08:45:43 host-192-168-31-217 systemd: Reloading.
Apr 7 08:45:46 host-192-168-31-217 Keepalived_vrrp[3756]: VRRP_Instance(3217500556934cce9d2b02317450744d) Sending gratuitous ARPs on eth1 for 192.100.100.22
Apr 7 08:45:48 host-192-168-31-217 systemd: haproxy-8dcf9ec0-ec47-4068-a4fa-b4a117f14c65.service: control process exited, code=exited status=1
Apr 7 08:45:48 host-192-168-31-217 systemd: Reload failed for HAProxy Load Balancer.
-----
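Since the agent only returns a generic 500, the real reload error has to be pulled from systemd on the amphora (unit name from the messages above):
----
journalctl -u haproxy-8dcf9ec0-ec47-4068-a4fa-b4a117f14c65.service -n 20 --no-pager
systemctl status -l haproxy-8dcf9ec0-ec47-4068-a4fa-b4a117f14c65.service
----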
Fix proposed to branch: master
Review: https://review.openstack.org/455569