Distributed Cloud: Better error handling in Horizon navigation following swact operation

Bug #1799711 reported by Wendy Mitchell
Affects: StarlingX
Status: Won't Fix
Importance: Low
Assigned to: Tyler Smith

Bug Description

Brief Description
-----------------
Better error handling in Horizon navigation following swact operation

Severity
--------
Normal

Steps to Reproduce
------------------
1. Log in to Horizon as admin on the system controller.
2. Select a subcloud context, e.g. admin (subcloud-4).
3. Navigate to Platform > Host Inventory; both controllers are listed and enabled/available.
4. Initiate a swact in Horizon on the active controller.
5. Navigate to another panel, e.g. Provider Networks.

Expected Behavior
------------------
Horizon should handle the transient loss of backend services during a swact gracefully, e.g. by showing a recoverable warning rather than raw errors and server-side exceptions.
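One possible shape for such handling (a sketch only, not part of this report or the actual StarlingX code; all names below are hypothetical): treat connection failures during a swact as transient and retry with backoff before surfacing an error to the user.

```python
import time

def call_with_retry(func, retries=3, delay=0.1, transient=(ConnectionError,)):
    """Call func(), retrying on transient connection errors.

    Hypothetical helper: during a swact the old controller's endpoints
    refuse connections briefly, so a short retry loop can bridge the gap.
    """
    last_exc = None
    for attempt in range(retries):
        try:
            return func()
        except transient as exc:
            last_exc = exc
            time.sleep(delay * (2 ** attempt))  # exponential backoff
    raise last_exc  # still failing: let the caller show a recoverable error

# Example: a fake endpoint that fails twice (old controller gone),
# then succeeds once the new active controller is reachable.
calls = {"n": 0}

def fake_alarm_summary():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("ECONNREFUSED")
    return {"alarms": 0}

print(call_with_retry(fake_alarm_summary))  # -> {'alarms': 0}
```

A wrapper like this would sit around the backend calls the panels make (alarm summary, neutron extensions), so a mid-swact refresh degrades to a short delay instead of an error banner or a 500.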

Actual Behavior
----------------
After admin on subcloud-4 performs a swact, navigation in Horizon results in errors, e.g.:
Error: Unable to retrieve alarm summary
Error: Connection to neutron failed %(reason)s

Reproducibility
---------------
Reproducible

System Configuration
--------------------
Dist. Cloud system

Branch/Pull Time/Commit
-----------------------
StarlingX_18.10 release branch as of 2018-10-18_01-52-00

Timestamp/Logs
--------------

See horizon.log:
2018-10-23 21:27:52,959 [INFO] horizon.operation_log: [admin 7ba14af8570c46afaa1859de63c53899] [admin 363bf89ae9584f74ac7bfb93228e9189] [POST /admin/inventory/ 302] parameters:[{"action": "hostscontroller__swact__2", "csrfmiddlewaretoken": "F85pa3csffCLFr9PD5agiqjfhdBaFfKy"}] message:[success: Swact Initiated Host: controller-1]
2018-10-23 21:28:03,908 [ERROR] openstack_dashboard.dashboards.admin.aggregates.panel: Call to list supported extensions failed. This is likely due to a problem communicating with the Nova endpoint.
2018-10-23 21:28:04,380 [WARNING] horizon.exceptions: Recoverable error: Error finding address for http://10.10.2.2:18002/v1/alarms/summary: HTTPConnectionPool(host='10.10.2.2', port=18002): Max retries exceeded with url: /v1/alarms/summary (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f0898c74290>: Failed to establish a new connection: [Errno 111] ECONNREFUSED',))
2018-10-23 21:28:10,225 [WARNING] horizon.exceptions: Recoverable error: Connection to neutron failed: %(reason)s
2018-10-23 21:28:10,225 [ERROR] django.request: Internal Server Error: /admin/host_topology/
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/django/core/handlers/base.py", line 132, in get_response
    response = wrapped_callback(request, *callback_args, **callback_kwargs)
  File "/usr/lib/python2.7/site-packages/horizon/decorators.py", line 52, in dec
    return view_func(request, *args, **kwargs)
  File "/usr/lib/python2.7/site-packages/horizon/decorators.py", line 36, in dec
    return view_func(request, *args, **kwargs)
  File "/usr/lib/python2.7/site-packages/horizon/decorators.py", line 84, in dec
    return view_func(request, *args, **kwargs)
  File "/usr/lib/python2.7/site-packages/horizon/decorators.py", line 52, in dec
    return view_func(request, *args, **kwargs)
  File "/usr/lib/python2.7/site-packages/horizon/decorators.py", line 36, in dec
    return view_func(request, *args, **kwargs)
  File "/usr/lib/python2.7/site-packages/horizon/decorators.py", line 84, in dec
    return view_func(request, *args, **kwargs)
  File "/usr/lib/python2.7/site-packages/django/views/generic/base.py", line 71, in view
    return self.dispatch(request, *args, **kwargs)
  File "/usr/lib/python2.7/site-packages/django/views/generic/base.py", line 89, in dispatch
    return handler(request, *args, **kwargs)
  File "/usr/lib/python2.7/site-packages/django/views/generic/base.py", line 158, in get
    context = self.get_context_data(**kwargs)
  File "/usr/lib/python2.7/site-packages/starlingx_dashboard/dashboards/admin/host_topology/views.py", line 147, in get_context_data
    context['instance_quota_exceeded'] = self._quota_exceeded('instances')
  File "/usr/lib/python2.7/site-packages/starlingx_dashboard/dashboards/admin/host_topology/views.py", line 138, in _quota_exceeded
    usages = quotas.tenant_quota_usages(self.request)
  File "/usr/lib/python2.7/site-packages/horizon/utils/memoized.py", line 88, in wrapped
    value = cache[key] = func(*args, **kwargs)
  File "/usr/share/openstack-dashboard/openstack_dashboard/usage/quotas.py", line 459, in tenant_quota_usages
    disabled_quotas = get_disabled_quotas(request)
  File "/usr/share/openstack-dashboard/openstack_dashboard/usage/quotas.py", line 273, in get_disabled_quotas
    if neutron.is_extension_supported(request, 'security-group'):
  File "/usr/lib/python2.7/site-packages/horizon/utils/memoized.py", line 88, in wrapped
    value = cache[key] = func(*args, **kwargs)
  File "/usr/share/openstack-dashboard/openstack_dashboard/api/neutron.py", line 1568, in is_extension_supported
    extensions = list_extensions(request)
  File "/usr/lib/python2.7/site-packages/horizon/utils/memoized.py", line 88, in wrapped
    value = cache[key] = func(*args, **kwargs)
  File "/usr/share/openstack-dashboard/openstack_dashboard/api/neutron.py", line 1556, in list_extensions
    extensions_list = neutronclient(request).list_extensions()
  File "/usr/lib/python2.7/site-packages/neutronclient/v2_0/client.py", line 817, in list_extensions
    return self.get(self.extensions_path, params=_params)
  File "/usr/lib/python2.7/site-packages/neutronclient/v2_0/client.py", line 355, in get
    headers=headers, params=params)
  File "/usr/lib/python2.7/site-packages/neutronclient/v2_0/client.py", line 332, in retry_request
    headers=headers, params=params)
  File "/usr/lib/python2.7/site-packages/neutronclient/v2_0/client.py", line 283, in do_request
    resp, replybody = self.httpclient.do_request(action, method, body=body)
  File "/usr/lib/python2.7/site-packages/neutronclient/client.py", line 199, in do_request
    **kwargs)
  File "/usr/lib/python2.7/site-packages/neutronclient/client.py", line 116, in _cs_request
    raise exceptions.ConnectionFailed(reason=e)
ConnectionFailed: Connection to neutron failed: %(reason)s
2018-10-23 21:28:10,244 [WARNING] horizon.exceptions: Recoverable error: Error finding address for http://10.10.2.2:18002/v1/alarms/summary: HTTPConnectionPool(host='10.10.2.2', port=18002): Max retries exceeded with url: /v1/alarms/summary (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f08990e8950>: Failed to establish a new connection: [Errno 111] ECONNREFUSED',))
2018-10-23 21:28:15,250 [WARNING] horizon.exceptions: Recoverable error: Connection to neutron failed: %(reason)s
2018-10-23 21:28:15,270 [WARNING] horizon.exceptions: Recoverable error: Error finding address for http://10.10.2.2:18002/v1/alarms/summary: HTTPConnectionPool(host='10.10.2.2', port=18002): Max retries exceeded with url: /v1/alarms/summary (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f0899745fd0>: Failed to establish a new connection: [Errno 111] ECONNREFUSED',))
2018-10-23 21:28:21,197 [WARNING] horizon.exceptions: Recoverable error: Connection to neutron failed: %(reason)s
2018-10-23 21:28:25,013 [WARNING] horizon.exceptions: Recoverable error: HTTP Server Error (HTTP 500) (Request-ID: req-5ebfe034-2f2a-4600-ad01-21476b1bff0a)
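As an aside on the log text: the literal `%(reason)s` in "Connection to neutron failed: %(reason)s" suggests the exception's message template was never interpolated. A minimal sketch of the oslo-style template pattern (simplified, hypothetical classes; not the actual neutronclient code) shows how a missing keyword leaves the raw placeholder in the message:

```python
# Simplified, hypothetical illustration of message-template exceptions;
# not the real neutronclient classes.

class NeutronClientException(Exception):
    message = "An unknown exception occurred."

    def __init__(self, **kwargs):
        try:
            # Placeholders are filled from keyword arguments.
            msg = self.message % kwargs
        except KeyError:
            # Missing kwarg: the raw template leaks through, which is
            # how "%(reason)s" can surface in logs and in the UI.
            msg = self.message
        super().__init__(msg)

class ConnectionFailed(NeutronClientException):
    message = "Connection to neutron failed: %(reason)s"

print(str(ConnectionFailed(reason="ECONNREFUSED")))
print(str(ConnectionFailed()))  # missing kwarg -> unformatted template
```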

Revision history for this message
Ghada Khalil (gkhalil) wrote :

Low priority issue as the user is able to re-connect. Fix is best effort; no specific target release.

Changed in starlingx:
importance: Undecided → Low
status: New → Triaged
tags: added: stx.gui
Changed in starlingx:
assignee: nobody → Tyler Smith (tyler.smith)
Ghada Khalil (gkhalil)
summary: - STX: Distributed Cloud: Better error handling in Horizon navigation
- following swact operation
+ Distributed Cloud: Better error handling in Horizon navigation following
+ swact operation
Ghada Khalil (gkhalil)
tags: added: stx.distcloud
Revision history for this message
Ghada Khalil (gkhalil) wrote :

There are significant changes required to support Distributed Cloud in stx due to the containerized environment. Based on discussion with the Distributed Cloud PL, Dariush Eslimi, we agreed to close old Distributed Cloud bugs as Won't Fix. A full test cycle will be needed to verify the new implementation and new bugs can be opened for the issues reported then.

Changed in starlingx:
status: Triaged → Won't Fix