Horizon current 2M huge page value set appears to be larger than the context help 2M Max value returned from horizon (vm_hugepages_possible_2M)
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
StarlingX | Fix Released | Medium | Tao Liu |
Bug Description
---------------
Horizon's current 2M huge page value appears to be larger than the context-help 2M max value returned from Horizon (vm_hugepages_possible_2M).
Steps to Reproduce
------------------
1. On a newly installed system where no instances exist, log into Horizon and lock a compute host, e.g. compute-1.
2. Navigate to the Memory tab and click Update Memory to open the dialog.
   Hover over the context help for, e.g., node 0: "# of VM 2M Hugepages Node 0".
3. Compare the value returned with the per-node output for vm_hp_total_2M (and vm_hp_avail_2M) from system host-memory-list, and with the 2M values under 'memory_mb_node' in nova hypervisor-show <hostid>.
   Note: the current 2M huge page value appears to be larger than the context-help value returned from Horizon in step 2.
   My understanding is that the context-help value is vm_hugepages_possible_2M.
4. In the Update Memory Allocation dialog, attempt to modify the 2M huge page value to something beyond the current setting and beyond the max for that node, e.g. 29999, then save.
   For example: the current 2M memory setting for node-0 is 27420, while the max 2M memory setting according to Horizon is 27375 (why smaller?).
   Attempting to set node-0 2M memory beyond the max, e.g. 27999, yields:
   Error: Processor 0:No available space for 2M huge page allocation, max 2M pages: 27375
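The inconsistency observed in steps 3 and 4 can be illustrated with the example figures from this report (27420 currently allocated, 27375 reported as the maximum). The sketch below is a minimal, hypothetical statement of the consistency property being violated; the function name and structure are illustrative and are not Horizon's actual code:

```python
# Minimal sketch of the consistency property violated in this bug.
# The values are the example figures from the report; the function is
# illustrative, not Horizon's actual validation logic.

def check_2m_hugepage_limits(current_2m, max_2m, requested_2m):
    """Return a list of inconsistencies between current, max, and requested."""
    problems = []
    if current_2m > max_2m:
        # The already-applied setting exceeds the reported maximum --
        # exactly what step 3 observes (27420 > 27375).
        problems.append(
            f"current ({current_2m}) exceeds reported max ({max_2m})")
    if requested_2m > max_2m:
        # This is the case Horizon rejects in step 4 with:
        # "No available space for 2M huge page allocation, max 2M pages: 27375"
        problems.append(
            f"requested ({requested_2m}) exceeds reported max ({max_2m})")
    return problems

print(check_2m_hugepage_limits(current_2m=27420, max_2m=27375,
                               requested_2m=27999))
```

A consistent system would report a maximum no smaller than the value already in effect, so the first branch should never fire.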
Expected Behavior
------------------
Consistency: the maximum value should match across the context help (Maximum 2M pages), the maximum reported in the error message, and the maximum that is actually allowed to be set.
Actual Behavior
----------------
The max 2M memory returned (vm_hugepages_possible_2M) is smaller than the current 2M huge page setting.
Reproducibility
---------------
Reproduced (Horizon).
System Configuration
--------------------
Standard: 2 controllers + X computes
Branch/Pull Time/Commit
-----------------------
master as of 2018-09-19_21-38-00
Timestamp/Logs
--------------
Tracker updates:
- description: updated
- tags: added stx.2019.05, removed stx.2019.03
- tags: added stx.2.0, removed stx.2019.05
Targeting stx.2019.03, as this appears to be a minor issue.