vcpu failed to scale up as expected after running dd on VM

Bug #1795423 reported by Peng Peng
Affects Status Importance Assigned to Milestone
StarlingX
Invalid
Medium
Tee Ngo

Bug Description

Brief Description
-----------------
Failed to scale up the VM's vCPUs using "dd"

Severity
--------
Major

Steps to Reproduce
------------------
1. Create a heat stack for auto-scaling VMs
2. Create a VM via the heat stack for vCPU scaling
3. Use nova to scale the VM's vCPUs down to 2
4. Scale the VM's vCPUs back up by running dd processes in the VM
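
Step 4 can be sketched as below. This is illustrative only: the worker count and timings are assumptions, not values from the test case. Each dd copying /dev/zero to /dev/null is a pure CPU burner with no disk I/O; the real test leaves the workers running until the scaling policy reacts.

```shell
# Start busy dd workers to push guest CPU usage high enough that the
# auto-scaling policy should add a vCPU (worker count is an assumption).
pids=""
for i in 1 2; do
    dd if=/dev/zero of=/dev/null &
    pids="$pids $!"
done
echo "started dd workers:$pids"

# Let the workers burn CPU briefly, then clean up. In the real test the
# harness instead polls `nova show <vm>` until wrs-res:vcpus changes.
sleep 1
kill $pids
echo "workers stopped"
```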

Expected Behavior
------------------
vCPU count increases to 3

Actual Behavior
----------------
vCPU count remains at 2 after 20 minutes

Reproducibility
---------------
Reproducible
The same test case also failed on the following loads:
Load: 2018-09-28_12-39-00
Load: 2018-09-28_20-18-00
Load: 2018-09-29_20-18-00

System Configuration
--------------------
One node system

Branch/Pull Time/Commit
-----------------------
master as of 2018-09-30_20-18-00

Timestamp/Logs
--------------
[2018-10-01 07:38:24,136] 262 DEBUG Thread-1 ssh.send :: Send 'nova --os-username 'admin' --os-password 'Li69nux*' --os-project-name admin --os-auth-url http://127.168.204.2:5000/v3 --os-user-domain-name Default --os-project-domain-name Default --os-region-name RegionOne show e8be894e-beb9-4826-915c-5a4dd5db483f'
...
| wrs-res:vcpus | [2, 2, 3]

[2018-10-01 07:38:33,837] 262 DEBUG Thread-1 ssh.send :: Send 'dd if=/dev/zero of=/dev/null &'
[2018-10-01 07:38:33,944] 382 DEBUG Thread-1 ssh.expect :: Output:
[1] 1454
]0;root@vm-cpu-scale:~vm-cpu-scale:~#
[2018-10-01 07:38:33,944] 262 DEBUG Thread-1 ssh.send :: Send 'echo $?'
[2018-10-01 07:38:34,049] 382 DEBUG Thread-1 ssh.expect :: Output:
0
]0;root@vm-cpu-scale:~vm-cpu-scale:~#
[2018-10-01 07:38:34,049] 419 DEBUG Thread-1 ssh.exec_cmd:: Executing command...
[2018-10-01 07:38:34,049] 262 DEBUG Thread-1 ssh.send :: Send 'dd if=/dev/zero of=/dev/null &'
[2018-10-01 07:38:34,156] 382 DEBUG Thread-1 ssh.expect :: Output:
[2] 1455

[2018-10-01 07:58:15,798] 262 DEBUG MainThread ssh.send :: Send 'nova --os-username 'admin' --os-password 'Li69nux*' --os-project-name admin --os-auth-url http://127.168.204.2:5000/v3 --os-user-domain-name Default --os-project-domain-name Default --os-region-name RegionOne show e8be894e-beb9-4826-915c-5a4dd5db483f'
....
| wrs-res:vcpus | [2, 2, 3]

Revision history for this message
Ghada Khalil (gkhalil) wrote :

Targeting stx.2019.03 -- not a critical issue so it's not required to port it back to stx.2018.10

Changed in starlingx:
assignee: nobody → Tee Ngo (teewrs)
tags: added: stx.2019.03 stx.config
Changed in starlingx:
status: New → Triaged
importance: Undecided → Medium
Revision history for this message
Tee Ngo (teewrs) wrote :

Before running this test case, please check whether the vcpu_util meter exists.

Verify that the needed metric exists after the VM is launched from the VMAutoScaling heat stack:
>nova list --all-ten
     to get the VM UUID
>gnocchi metric list | grep <VM-UUID> | grep vcpu_util
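
The precondition check above can be wrapped as a small helper. The function name is mine, not from the comment; it reads `gnocchi metric list` output on stdin so the filtering logic can be exercised without a live system.

```shell
# Hypothetical helper: report whether a vcpu_util metric exists for a VM.
# Reads `gnocchi metric list` output on stdin.
check_vcpu_util_metric() {
    uuid="$1"
    if grep "$uuid" | grep -q vcpu_util; then
        echo "vcpu_util metric present for $uuid"
    else
        echo "vcpu_util metric missing for $uuid"
    fi
}

# Against a live system this would be driven as:
#   gnocchi metric list | check_vcpu_util_metric <VM-UUID>
```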

If the required meter does not exist, check ceilometer-agent-notification.log for a possible cause. Based on the provided logs, the ceilometer-agent-notification process had been bouncing in that build.

2018-10-01T10:37:55.000 controller-0 ceilometer-agent-notification: err 2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils [-] Unhandled exception: IOError: [Errno 2] No such file or directory: '/usr/lib/python2.7/site-packages/ceilometer/pipeline/data/opt/cgcs/ceilometer/18.10/pipeline.yaml'
2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils Traceback (most recent call last):
2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils File "/usr/lib/python2.7/site-packages/cotyledon/_utils.py", line 84, in exit_on_exception
2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils yield
2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils File "/usr/lib/python2.7/site-packages/cotyledon/_service.py", line 139, in _run
2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils self.run()
2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils File "/usr/lib/python2.7/site-packages/ceilometer/notification.py", line 189, in run
2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils self.pipeline_manager = pipeline.setup_pipeline(self.conf)
2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils File "/usr/lib/python2.7/site-packages/ceilometer/pipeline.py", line 905, in setup_pipeline
2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils SAMPLE_TYPE)
2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils File "/usr/lib/python2.7/site-packages/ceilometer/pipeline.py", line 781, in __init__
2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils cfg = self.load_config(cfg_file)
2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils File "/usr/lib/python2.7/site-packages/ceilometer/pipeline.py", line 667, in load_config
2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils with open(self.cfg_loc) as fap:
2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils IOError: [Errno 2] No such file or directory: '/usr/lib/python2.7/site-packages/ceilometer/pipeline/data/opt/cgcs/ceilometer/18.10/pipeline.yaml'
2018-10-01 10:37:55.029 4124 ERROR cotyledon._utils
2018-10-01T10:37:55.000 controller-0 ceilometer-agent-notification: info 2018-10-01 10:37:55.040 21590 INFO cotyledon._service_manager [-] Child 4124 exited with status 2
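
One plausible reading of the IOError above (an assumption on my part, not confirmed by the traceback alone) is that the default in-tree pipeline data directory and the absolute /opt/cgcs override were fused by naive string concatenation instead of proper path handling:

```python
import os

# Hypothetical reconstruction of the bad path from the log.
default_dir = "/usr/lib/python2.7/site-packages/ceilometer/pipeline/data"
override = "/opt/cgcs/ceilometer/18.10/pipeline.yaml"

# Naive concatenation reproduces the nonexistent path in the IOError:
bad = default_dir + override
print(bad)

# os.path.join discards the left side when the right side is absolute,
# which is how an absolute override is normally meant to win:
good = os.path.join(default_dir, override)
assert good == override
```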

As the issue has not been reported since (perhaps the code/config error has been rectified), please update the test case with the aforementioned precondition check and close this bug.

Ghada Khalil (gkhalil)
summary: - STX: vcpu failed to scale up as expected after running dd on VM
+ vcpu failed to scale up as expected after running dd on VM
Ken Young (kenyis)
tags: added: stx.2019.05
removed: stx.2019.03
Revision history for this message
Peng Peng (ppeng) wrote :

This ticket has been rejected, since the root cause of this issue was
https://bugs.launchpad.net/starlingx/+bug/1798623

That issue has been fixed, so both tickets can be closed now.

Revision history for this message
Tee Ngo (teewrs) wrote :

Update status to invalid per PV comment.

Changed in starlingx:
status: Triaged → Invalid
Ken Young (kenyis)
tags: added: stx.2.0
removed: stx.2019.05