The VM state is set to ERROR after setting the password fails

Bug #1746972 reported by yangjie
Affects: OpenStack Compute (nova)
Status: Confirmed
Importance: Undecided
Assigned to: Unassigned

Bug Description

Description
===========
After the command 'nova set-password' fails, the virtual machine's vm_state is set to ERROR while its power state is still Running. We should not change vm_state in this situation.

Steps to reproduce
==================
1. Choose a running VM that does not have the QEMU guest agent (QGA) installed.
2. Execute 'nova set-password <UUID>'.
3. The command fails, and the vm_state turns to ERROR.

Expected result
===============
The command fails, but the vm_state remains ACTIVE.

Actual result
=============
The vm_state turns to ERROR.

Environment
===========
Pike
Nova 16.0.4
libvirt & KVM

Logs & Configs
==============
[root@E9000slot5 ~(keystone_admin)]# nova set-password 9f9330c2-4ab4-45f1-a9f9-2770dd34cf30
New password:
Again:
ERROR (Conflict): Failed to set admin password on 9f9330c2-4ab4-45f1-a9f9-2770dd34cf30 because error setting admin password (HTTP 409) (Request-ID: req-0b533563-132e-42e2-8ef3-5665fa8e7187)

2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server [req-0b533563-132e-42e2-8ef3-5665fa8e7187 d118d43c722e4cf48be77d6cff727810 3ac86e9ee3014040832eae0191490b3f - default default] Exception during message handling: InstancePasswordSetFailed: Failed to set admin password on 9f9330c2-4ab4-45f1-a9f9-2770dd34cf30 because error setting admin password

[root@E9000slot5 ~(keystone_admin)]# nova list
+--------------------------------------+--------+--------+------------+-------------+------------------+
| ID                                   | Name   | Status | Task State | Power State | Networks         |
+--------------------------------------+--------+--------+------------+-------------+------------------+
| 9f9330c2-4ab4-45f1-a9f9-2770dd34cf30 | testyj | ERROR  | -          | Running     | test=192.168.3.9 |
+--------------------------------------+--------+--------+------------+-------------+------------------+

Tags: compute
Matt Riedemann (mriedem) wrote:

Which version of nova are you using?

tags: added: compute
Matt Riedemann (mriedem) wrote:

We must have gotten here:

https://github.com/openstack/nova/blob/8ac8c995ca2ca2613dbcdbe6c82d771b279e0d93/nova/compute/manager.py#L3487

What is the exception traceback in the nova-compute logs when you hit this?

We should have gotten here if the driver was raising the correct exception:

https://github.com/openstack/nova/blob/8ac8c995ca2ca2613dbcdbe6c82d771b279e0d93/nova/compute/manager.py#L3455
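
For readers following along, here is a minimal self-contained sketch of that control flow. This paraphrases the Pike-era code at the links above; the classes below are illustrative stand-ins, not real Nova objects.

```python
# Paraphrased sketch of the error handling in
# nova/compute/manager.py:set_admin_password (Pike-era); FakeInstance
# and the exception class are stand-ins for illustration only.

class InstancePasswordSetFailed(Exception):
    pass


class FakeInstance(object):
    def __init__(self, uuid):
        self.uuid = uuid
        self.vm_state = 'active'
        self.task_state = 'updating_password'

    def save(self):
        pass  # the real object persists vm_state/task_state to the DB


def set_admin_password(driver, instance, new_pass):
    try:
        driver.set_admin_password(instance, new_pass)
        instance.task_state = None
        instance.save()
    except NotImplementedError:
        # The ~L3455 branch: the driver does not support the operation,
        # so the instance is restored to ACTIVE before re-raising.
        instance.vm_state = 'active'
        instance.task_state = None
        instance.save()
        raise
    except Exception:
        # The ~L3487 branch this bug hits: any other driver failure
        # (e.g. the guest agent refusing the command) marks the whole
        # instance ERROR even though the guest keeps running.
        instance.vm_state = 'error'
        instance.save()
        raise InstancePasswordSetFailed(
            'Failed to set admin password on %s' % instance.uuid)
```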

yangjie (yang.jie) wrote:

Environment
===========
Pike

libvirt & KVM

Exception traceback:

2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server [req-0b533563-132e-42e2-8ef3-5665fa8e7187 d118d43c722e4cf48be77d6cff727810 3ac86e9ee3014040832eae0191490b3f - default default] Exception during message handling: InstancePasswordSetFailed: Failed to set admin password on 9f9330c2-4ab4-45f1-a9f9-2770dd34cf30 because error setting admin password
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 160, in _process_incoming
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 213, in dispatch
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 183, in _do_dispatch
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/exception_wrapper.py", line 76, in wrapped
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server function_name, call_dict, binary)
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server self.force_reraise()
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb)
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/exception_wrapper.py", line 67, in wrapped
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw)
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 194, in decorated_function
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server "Error: %s", e, instance=instance)
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server self.force_reraise()
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server six.reraise(self.type_, self.value, self.tb)
2018-02-02 18:55:06.529 35382 ERROR oslo_messaging.rpc.server File "/usr/lib/pyth...


Ameed Ashour (ameeda)
Changed in nova:
assignee: nobody → Ameed Ashour (ameeda)
Ameed Ashour (ameeda) wrote:

Hello

I tried to reproduce this bug; this is the error message I actually got:

$ nova set-password 838a5216-13dd-4c91-a541-73c3f6e34b5d
New password:
Again:
ERROR (Conflict): QEMU guest agent is not enabled (HTTP 409) (Request-ID: req-3c6613e1-d106-4d96-9398-dc54c0191469)

and the log is here:
https://pasteboard.co/H6o9Kqd.png

master/queens

Sylvain Bauza (sylvain-bauza) wrote:

You need to set the image metadata property hw_qemu_guest_agent to True in order to tell libvirt that the QEMU guest agent is installed.

https://github.com/openstack/nova/blob/b37aa5415378c2d814e7bc6ae115c1e58705d4bf/nova/virt/libvirt/driver.py#L1970
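
For example, with the standard OpenStack client (the image ID is a placeholder):

openstack image set --property hw_qemu_guest_agent=yes <image-uuid>

Note that the property is read when the guest is defined, so it only takes effect for instances booted (or rebuilt) from the image after the property is set.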

Changed in nova:
status: New → Invalid
s10 (vlad-esten) wrote:

We have also faced this bug in some situations. I believe that nova shouldn't set the instance to the ERROR state, because only the password change operation has failed and nothing destructive has been done to the instance.
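
To make that concrete, here is a sketch of what leaving vm_state alone could look like (reusing the stand-in classes from the sketch in Matt's comment above; this is a suggestion, not a merged patch):

```python
def set_admin_password_proposed(driver, instance, new_pass):
    # Sketch of s10's suggestion: a failed password change is not
    # destructive, so only clear the task state and re-raise; the
    # instance stays ACTIVE and keeps running.
    try:
        driver.set_admin_password(instance, new_pass)
        instance.task_state = None
        instance.save()
    except Exception:
        instance.task_state = None   # unblock further API actions
        instance.save()              # vm_state is left untouched ('active')
        raise InstancePasswordSetFailed(
            'Failed to set admin password on %s' % instance.uuid)
```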

Changed in nova:
status: Invalid → Confirmed
s10 (vlad-esten) wrote:

This could happen if qemu-guest-agent is running inside the VM, but for some reason it fails to change the password and produces an error like: {"error": {"class": "GenericError", "desc": "child process has failed to set user password"}}
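
The agent-side failure can also be triggered directly, bypassing nova, through libvirt's agent passthrough (domain name, username, and password below are placeholders; per the QGA protocol the password must be base64-encoded):

virsh qemu-agent-command <domain> '{"execute": "guest-set-user-password", "arguments": {"username": "root", "password": "<base64-password>", "crypted": false}}'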

s10 (vlad-esten) wrote:

How to reproduce this error:
1. Create an instance from an image with the property hw_qemu_guest_agent=yes and qemu-guest-agent installed inside.
2. Stop qemu-guest-agent inside this instance:
```
...
root 1450 0.0 0.0 21972 188 ? Ss 14:02 0:00 /usr/sbin/qemu-ga --daemonize -m virtio-serial -p /dev/virtio-ports/org.qemu.guest_agent.0
...
root@francoise:~# service qemu-guest-agent stop
```
3. Try to change the password with ``nova set-password``.
4. See that the instance goes to the ERROR state but continues to run (a verification command is shown below the trace link).
Trace: http://paste.openstack.org/show/731375/
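
After step 4, the vm_state/power-state mismatch can be confirmed from the API side, e.g. with the standard openstack client:

openstack server show <UUID> -c status -c OS-EXT-STS:vm_state -c OS-EXT-STS:power_state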

Changed in nova:
assignee: Ameed Ashour (ameeda) → nobody
Matt Riedemann (mriedem) wrote:

Since there is a patch up for bug 1757061, which is the same issue, I'm going to mark this bug as a duplicate of that one.
