euca-attach-volume fails when using iSCSI on XenServer

Bug #745340 reported by Renuka Apte
This bug affects 2 people
Affects: OpenStack Compute (nova)
Status: Fix Released
Importance: Medium
Assigned to: Renuka Apte
Milestone: 2011.3

Bug Description

I am trying to create/attach volumes using iSCSI on XenServer. I followed the instructions here: http://docs.openstack.org/openstack-compute/admin/content/ch05s07.html#d5e579

When I run euca-attach-volume, I see the following trace in the nova-compute logs:

2011-03-29 17:13:42,040 ERROR nova.compute.manager [6N1A9AUKLB00N9VUQCL9 admin admin] instance 1: attach failed /dev/sda, removing
(nova.compute.manager): TRACE: Traceback (most recent call last):
(nova.compute.manager): TRACE: File "/usr/lib/python2.6/site-packages/nova/compute/manager.py", line 531, in attach_volume
(nova.compute.manager): TRACE: mountpoint)
(nova.compute.manager): TRACE: File "/usr/lib/python2.6/site-packages/nova/virt/xenapi_conn.py", line 211, in attach_volume
(nova.compute.manager): TRACE: mountpoint)
(nova.compute.manager): TRACE: File "/usr/lib/python2.6/site-packages/nova/virt/xenapi/volumeops.py", line 55, in attach_volume
(nova.compute.manager): TRACE: vol_rec = VolumeHelper.parse_volume_info(device_path, mountpoint)
(nova.compute.manager): TRACE: File "/usr/lib/python2.6/site-packages/nova/virt/xenapi/volume_utils.py", line 165, in parse_volume_info
(nova.compute.manager): TRACE: (iscsi_name, iscsi_portal) = _get_target(volume_id)
(nova.compute.manager): TRACE: File "/usr/lib/python2.6/site-packages/nova/virt/xenapi/volume_utils.py", line 251, in _get_target
(nova.compute.manager): TRACE: volume_id)
(nova.compute.manager): TRACE: File "/usr/lib/python2.6/site-packages/nova/db/api.py", line 711, in volume_get_by_ec2_id
(nova.compute.manager): TRACE: return IMPL.volume_get_by_ec2_id(context, ec2_id)
(nova.compute.manager): TRACE: File "/usr/lib/python2.6/site-packages/nova/utils.py", line 368, in __getattr__
(nova.compute.manager): TRACE: return getattr(backend, key)
(nova.compute.manager): TRACE: AttributeError: 'module' object has no attribute 'volume_get_by_ec2_id'
(nova.compute.manager): TRACE:
2011-03-29 17:13:42,077 ERROR nova.root [-] Exception during message handling
(nova.root): TRACE: Traceback (most recent call last):
(nova.root): TRACE: File "/usr/lib/python2.6/site-packages/nova/rpc.py", line 192, in receive
(nova.root): TRACE: rval = node_func(context=ctxt, **node_args)
(nova.root): TRACE: File "/usr/lib/python2.6/site-packages/nova/compute/manager.py", line 92, in decorated_function
(nova.root): TRACE: function(self, context, instance_id, *args, **kwargs)
(nova.root): TRACE: File "/usr/lib/python2.6/site-packages/nova/compute/manager.py", line 544, in attach_volume
(nova.root): TRACE: raise exc
(nova.root): TRACE: AttributeError: 'module' object has no attribute 'volume_get_by_ec2_id'
(nova.root): TRACE:
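
The AttributeError at the bottom of both traces is a symptom of how nova/db/api.py is structured: it does not implement the database calls itself, but forwards every call to a pluggable backend module (nova.db.sqlalchemy.api) through a __getattr__ hook in nova/utils.py. Because the lookup is lazy, a function that api.py references but the backend no longer defines only fails at call time. A minimal sketch of the pattern, with simplified, illustrative names rather than nova's actual code:

    import importlib

    class LazyPluggable(object):
        """Forward attribute access to a backend module loaded on first use."""

        def __init__(self, backend_name):
            self.__backend_name = backend_name
            self.__backend = None

        def __getattr__(self, key):
            if self.__backend is None:
                self.__backend = importlib.import_module(self.__backend_name)
            # If the backend module lacks `key`, this getattr() raises the
            # AttributeError seen in the trace above.
            return getattr(self.__backend, key)

    IMPL = LazyPluggable('nova.db.sqlalchemy.api')

    def volume_get_by_ec2_id(context, ec2_id):
        # The forwarding stub in nova/db/api.py delegates to the backend, so
        # it breaks at call time, not import time, if the backend function
        # was removed in a refactor.
        return IMPL.volume_get_by_ec2_id(context, ec2_id)

This is why the break only surfaces when euca-attach-volume exercises the XenServer volume path: importing nova.db.api succeeds, and nothing complains until _get_target actually calls volume_get_by_ec2_id.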

Thierry Carrez (ttx) wrote:

What version are you running?

Changed in nova:
status: New → Incomplete
Armando Migliaccio (armando-migliaccio) wrote:

It was tested on an old revision of nova, but it looks like the implementation of volume_get_by_ec2_id that should be in nova/db/sqlalchemy/api.py is no longer available.

This also happens in the latest nova trunk (revno 894).
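
If the function was simply dropped from the sqlalchemy backend, one way to restore it would be to translate the EC2-style identifier back into the integer primary key and reuse the existing volume_get query. The sketch below assumes EC2 volume ids of the form vol-0000000c, i.e. a hex-encoded integer suffix, and a volume_get(context, volume_id) helper in nova/db/sqlalchemy/api.py; it is an illustration of the shape of a fix, not the committed patch:

    def ec2_id_to_id(ec2_id):
        """Convert an EC2-style id such as 'vol-0000000c' to an integer id."""
        return int(ec2_id.split('-')[-1], 16)

    def volume_get_by_ec2_id(context, ec2_id):
        """Fetch a volume by its EC2-style id via the plain integer getter."""
        return volume_get(context, ec2_id_to_id(ec2_id))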

Thierry Carrez (ttx)
Changed in nova:
importance: Undecided → Medium
status: Incomplete → Confirmed
Thierry Carrez (ttx)
Changed in nova:
assignee: nobody → Renuka Apte (renuka-apte)
status: Confirmed → In Progress
Changed in nova:
status: In Progress → Fix Committed
Thierry Carrez (ttx)
Changed in nova:
milestone: none → diablo-1
Thierry Carrez (ttx)
Changed in nova:
milestone: diablo-1 → 2011.3
status: Fix Committed → Fix Released