Hypervisor summary shows incorrect total storage

Bug #1359989 reported by Alex Madama
This bug affects 8 people
Affects             Status     Importance  Assigned to     Milestone
Mirantis OpenStack  Won't Fix  High        Sergey Nikitin  Nominated for 10.0.x by Roman Podoliaka
5.0.x               Won't Fix  High        Sergey Nikitin
5.1.x               Won't Fix  High        Sergey Nikitin
6.0.x               Won't Fix  High        Sergey Nikitin
6.1.x               Won't Fix  High        Sergey Nikitin
7.0.x               Won't Fix  High        Sergey Nikitin
8.0.x               Won't Fix  High        MOS Nova
9.x                 Won't Fix  High        MOS Nova

Bug Description

Fuel 5.1 (master-84)
OpenStack deployment: CentOS, multi-node with HA, Neutron with VLAN, three Ceph OSD nodes, Ceph for ephemeral, image, block, and object storage.

In the Horizon UI under Admin/Hypervisors, Disk Usage shows an incorrect value.
Since Ceph is used for ephemeral storage, the total adds up the Ceph storage reported by each storage node rather than using the real amount of Ceph storage.
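A minimal sketch of the double-counting (hypothetical numbers): every compute node backed by the same Ceph pool reports the pool's full capacity as its own local storage, and a naive aggregation simply sums those per-node figures.

```python
# Hypothetical shared Ceph pool of 300 GB, reported in full by each
# of three compute nodes as its local_gb.
CEPH_POOL_GB = 300
hypervisors = [
    {"hostname": f"node-{i}", "local_gb": CEPH_POOL_GB}
    for i in range(1, 4)
]

# Naive aggregation: the shared pool is counted once per compute node,
# so the reported total is multiplied by the number of nodes.
reported_total = sum(h["local_gb"] for h in hypervisors)

print(reported_total)  # 900 GB reported
print(CEPH_POOL_GB)    # 300 GB actually available
```

This is exactly the mismatch visible on the Hypervisors page: the report scales with the number of compute nodes, not with the real pool size.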

Revision history for this message
Alex Madama (amadama) wrote :
Changed in fuel:
importance: Undecided → High
no longer affects: fuel
Changed in mos:
importance: Undecided → High
assignee: nobody → MOS Horizon (mos-horizon)
milestone: none → 5.1
Revision history for this message
Timur Sufiev (tsufiev-x) wrote :

To obtain all hypervisor statistics, Horizon uses a single call at https://github.com/openstack/horizon/blob/2014.1.2/openstack_dashboard/api/nova.py#L678, which in turn calls the Nova endpoint (https://github.com/openstack/python-novaclient/blob/2.18.1/novaclient/v1_1/hypervisors.py#L73). Thus it is a Nova bug.

Changed in mos:
assignee: MOS Horizon (mos-horizon) → MOS Nova (mos-nova)
Revision history for this message
Dmitry Mescheryakov (dmitrymex) wrote :

The issue does not really break anything, hence I am lowering its priority to Medium. Also, since we have already reached soft code freeze for 5.1, I am bumping it to the 6.0 release.

Changed in mos:
importance: High → Medium
milestone: 5.1 → 6.0
Changed in mos:
status: New → Triaged
tags: added: nova
tags: added: release-notes
Revision history for this message
Roman Podoliaka (rpodolyaka) wrote :

I suggest that we add the following text to the known issues list:

When Ceph is used as a backend for ephemeral storage, the Disk Usage value in the Horizon UI under Admin/Hypervisors is incorrect: it adds up the Ceph storage reported by each storage node rather than using the real amount of Ceph storage.

Changed in mos:
assignee: MOS Nova (mos-nova) → Sergey Nikitin (snikitin)
Revision history for this message
Dmitry Borodaenko (angdraug) wrote :

This is definitely not High priority, so shouldn't be targeted for 5.x.

Revision history for this message
OSCI Robot (oscirobot) wrote :

Package nova-2014.2-fuel6.0.mira10.git.b81d6b1.7f687ef has been built from changeset: https://review.fuel-infra.org/511
RPM Repository URL: http://osci-obs.vm.mirantis.net:82/centos-fuel-6.0-stable-511/centos

Revision history for this message
OSCI Robot (oscirobot) wrote :

Package nova-2014.2-fuel6.0~mira10+git.b81d6b1.7f687ef has been built from changeset: https://review.fuel-infra.org/511
DEB Repository URL: http://osci-obs.vm.mirantis.net:82/ubuntu-fuel-6.0-stable-511/ubuntu

Revision history for this message
Dmitry Borodaenko (angdraug) wrote :

A High-priority customer-found bug was marked as a duplicate of this one; raising priority to match.

tags: added: customer-found
Revision history for this message
Roman Podoliaka (rpodolyaka) wrote :

Should be added to release notes:

"Nova hypervisor stats (CLI: nova hypervisor-stats; Horizon: hypervisors page) are misleading when a shared storage backend (Ceph) is used: the actual amount of space available/used is multiplied by the number of compute nodes. Note that this does not affect the booting of instances in any way; it only confuses an operator checking the resource usage report."
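As a rough sanity check only (this helper is an assumption for illustration, not part of any fix): when every compute node is backed by a single shared Ceph pool, an operator can divide the reported totals by the hypervisor count to recover the approximate real figures.

```python
def deflate_shared_storage(total_gb, hypervisor_count):
    """Undo the per-node multiplication for a single shared backend.

    Hypothetical helper: only valid under the assumption that all
    hypervisors report the same shared Ceph pool. Mixed local/shared
    deployments cannot be corrected this simply.
    """
    return total_gb // hypervisor_count

# nova hypervisor-stats reported 900 GB across 3 computes sharing one pool
print(deflate_shared_storage(900, 3))  # 300
```

This only helps an operator reading the report; Nova itself still stores and sums the inflated per-node values.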

Revision history for this message
Nha Pham (phqnha) wrote :

I think this bug is the same as
https://bugs.launchpad.net/nova/+bug/1252321

I have uploaded a patch:
https://review.openstack.org/#/c/155184/

Revision history for this message
Dmitry Mescheryakov (dmitrymex) wrote :

Too late to fix this in 6.0.1; moving to 6.0.2.

Revision history for this message
Sergey Nikitin (snikitin) wrote :

We currently have the same bug upstream: https://bugs.launchpad.net/nova/+bug/1252321. During review of the upstream fix we couldn't find a right solution, so we need to create an upstream spec and discuss it there, because this is not a trivial problem.
The Kilo release is too close, so we can't do it in Kilo; we have to move it to Liberty.
The Fuel release is close too, so we can't fix it in Fuel 6.0.2 or 6.1: if we fixed it in MOS only, we would end up with a big difference between the MOS and upstream code, because the bug is not trivial and the eventual upstream fix may be rather different from ours.

Changed in mos:
status: Confirmed → Triaged
Changed in mos:
status: Triaged → Won't Fix
status: Won't Fix → Triaged
milestone: 6.0.1 → 7.0
Revision history for this message
Dmitry Mescheryakov (dmitrymex) wrote :

The issue is targeted only to 7.0 because it requires significant effort to fix. The work is currently being done upstream.

tags: added: needs-bp
tags: added: release-notes-done
removed: release-notes
Revision history for this message
Roman Podoliaka (rpodolyaka) wrote :

We can't land this into Kilo upstream; Sergey will create a backlog ticket.

Changed in mos:
status: Triaged → Won't Fix
Revision history for this message
Roman Podoliaka (rpodolyaka) wrote :
Revision history for this message
Roman Podoliaka (rpodolyaka) wrote :

^ means we can't do it earlier downstream, as neither we nor upstream backport features.

tags: added: enhancement
Changed in mos:
status: Opinion → Won't Fix
Revision history for this message
Mike Scherbakov (mihgen) wrote :

We need to keep this open for 9.0 then, until we can provide a link to a patchset that closes this particular issue.

tags: added: wontfix-feature
Revision history for this message
Roman Podoliaka (rpodolyaka) wrote :

Unfortunately, this will not land in Mitaka either: https://review.openstack.org/#/c/253187/
