OSA run: cinder related to ceph got an empty ceph.conf

Bug #1453934 reported by Andreas Hasenack
This bug affects 2 people
Affects: cinder (Juju Charms Collection)
Status: In Progress
Importance: High
Assigned to: Liam Young
Milestone: none

Bug Description

In the autopilot, we relate cinder to ceph.

In this deployment, all 3 cinder units got a basically empty /etc/ceph/ceph.conf file:
"""
###############################################################################
# [ WARNING ]
# cinder configuration file maintained by Juju
# local changes may be overwritten.
###############################################################################
[global]
log to syslog =
 err to syslog =
 clog to syslog =
"""

I'm attaching all logs.

The same happened with the glance service: all 3 glance units got the same "empty" ceph.conf file and were not working. In the glance case, I destroyed the ceph-glance relation and recreated it. After that, it started working. The logs are from before that, though.

Revision history for this message
Andreas Hasenack (ahasenack) wrote :

juju status output

Revision history for this message
Andreas Hasenack (ahasenack) wrote :

cinder-volume.log also has these:
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher Traceback (most recent call last):
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo/messaging/rpc/dispatcher.py", line 134, in _dispatch_and_
reply
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher incoming.message))
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo/messaging/rpc/dispatcher.py", line 177, in _dispatch
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher return self._do_dispatch(endpoint, method, ctxt, args)
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/oslo/messaging/rpc/dispatcher.py", line 123, in _do_dispatch
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher result = getattr(endpoint, method)(ctxt, **new_args)
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/osprofiler/profiler.py", line 105, in wrapper
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher return f(*args, **kwargs)
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/cinder/volume/manager.py", line 381, in create_volume
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher _run_flow()
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/cinder/volume/manager.py", line 374, in _run_flow
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher flow_engine.run()
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/taskflow/engines/action_engine/engine.py", line 89, in run
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher for _state in self.run_iter():
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/taskflow/engines/action_engine/engine.py", line 137, in run_it
er
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher misc.Failure.reraise_if_any(failures.values())
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/taskflow/utils/misc.py", line 797, in reraise_if_any
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher failures[0].reraise()
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/taskflow/utils/misc.py", line 804, in reraise
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher six.reraise(*self._exc_info)
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/taskflow/engines/action_engine/executor.py", line 34, in _exec
ute_task
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher result = task.execute(**arguments)
2015-05-11 19:11:56.841 37990 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/dist-packages/cinder/volume/flows/manager/cr...


Revision history for this message
Chris Holcombe (xfactor973) wrote :

I'm going to try to reproduce this and see where it leads me.

Revision history for this message
Ryan Beisner (1chb1n) wrote :

FYI - adding a bug link, as it's a separate issue in the same chunk of charm-helpers (c-h) code that generates this ceph.conf file:
https://bugs.launchpad.net/charm-helpers/+bug/1468511
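
To illustrate the failure mode (a simplified sketch, not the actual charm-helpers code): if the context built from the ceph relation has no auth or monitor data yet, the template still renders, just with every value blank, which is exactly the kind of file shown in the description.

# Simplified illustration only -- not the charm-helpers implementation.
from string import Template

CEPH_CONF = Template("""[global]
auth supported = ${auth}
mon host = ${mon_hosts}
log to syslog = ${use_syslog}
err to syslog = ${use_syslog}
clog to syslog = ${use_syslog}
""")

def build_context(relation_data):
    # If the ceph units have not published auth/mon information yet,
    # these lookups fall back to empty strings instead of failing.
    return {
        'auth': relation_data.get('auth', ''),
        'mon_hosts': ' '.join(relation_data.get('mon_hosts', [])),
        'use_syslog': relation_data.get('use_syslog', ''),
    }

# Relation data that has not been fully populated by the ceph units:
print(CEPH_CONF.substitute(build_context({})))
# -> a [global] section with every value blank, much like the file above.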

Revision history for this message
Chris Holcombe (xfactor973) wrote :

I have a fix for the empty config file, but it is still throwing errors connecting to the cluster: permission denied, to be specific.

Revision history for this message
Chris Holcombe (xfactor973) wrote :

The permission denied error comes from the cinder client trying to create a pool: it needs mon 'allow rwx', but it only has mon 'allow rw'. We should talk to james-page about this.
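
For reference, the caps in question can be inspected and widened with the ceph CLI, roughly like this (the client.cinder name is an assumption about what the charm registers, and the exact osd caps would need checking against the live cluster):
"""
# show the caps currently granted to the client
ceph auth get client.cinder

# grant the wider mon caps that pool creation needs (example only;
# 'ceph auth caps' replaces all caps, so the osd caps must be restated)
ceph auth caps client.cinder mon 'allow rwx' osd 'allow rwx'
"""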

Revision history for this message
Ryan Beisner (1chb1n) wrote :

Can we please get the following info on this (and any other potential Ubuntu OpenStack charm bugs)?

* Which Ubuntu/OpenStack combo was deployed? i.e. trusty-icehouse, trusty-kilo, etc.
* `juju get <service>` output for the relevant services
* `juju status --format yaml` output [OK, ALREADY ATTACHED]

This information is needed to attempt to reproduce the issue.

Thank you.

Revision history for this message
Ryan Beisner (1chb1n) wrote :

FYI: Trusty-Juno is where the reporter observed this behavior, as indicated in the all-machines log.

Revision history for this message
Chris Holcombe (xfactor973) wrote :

ahasenack: I'm having trouble reproducing this locally again. Next time you encounter this, can you attach the following:
topology (juju status), versions (Ubuntu/OpenStack), and config (juju set/juju get)?
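
For reference, roughly the commands that would capture that (Juju 1.x syntax of the era; service names are examples):
"""
juju status --format yaml > juju-status.yaml
juju get cinder > cinder-config.yaml
juju get ceph > ceph-config.yaml
lsb_release -a    # run on a unit to record the Ubuntu release
"""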

Revision history for this message
Alberto Donato (ack) wrote :

We've seen this issue with trusty/kilo.

Revision history for this message
Liam Young (gnuoy) wrote :
Changed in cinder (Juju Charms Collection):
status: New → In Progress
importance: Undecided → High
assignee: nobody → Liam Young (gnuoy)