RADOS object not found

Bug #2021747 reported by Guillaume Boutry
Affects: Juju Charmed Operator - MicroCeph
Status: New
Importance: Undecided
Assigned to: Unassigned
Milestone: —

Bug Description

This happened during the bootstrap process of snap-openstack on a fresh VM.

Bootstrap command used:
`sunbeam -v cluster bootstrap --role compute --role control --role storage`

```
unit-microceph-0: 13:09:22 WARNING unit.microceph/0.juju-log unable to get monitor info from DNS SRV with service name: ceph-mon
2023-05-30T13:09:22.950+0000 7f64567b5640 -1 failed for service _ceph-mon._tcp
2023-05-30T13:09:22.950+0000 7f64567b5640 -1 monclient: get_monmap_and_config cannot identify monitors to contact
[errno 2] RADOS object not found (error connecting to the cluster)

unit-microceph-0: 13:09:22 ERROR unit.microceph/0.juju-log Exception raised in section 'Bootstrapping': Command '['ceph', 'config', 'set', 'global', 'mon_allow_pool_size_one', 'true']' returned non-zero exit status 1.
unit-microceph-0: 13:09:22 ERROR unit.microceph/0.juju-log Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-microceph-0/charm/venv/ops_sunbeam/guard.py", line 91, in guard
    yield
  File "/var/lib/juju/agents/unit-microceph-0/charm/venv/ops_sunbeam/charm.py", line 268, in configure_charm
    self.configure_app(event)
  File "/var/lib/juju/agents/unit-microceph-0/charm/venv/ops_sunbeam/charm.py", line 255, in configure_app
    self.configure_app_leader(event)
  File "/var/lib/juju/agents/unit-microceph-0/charm/./src/charm.py", line 242, in configure_app_leader
    self.configure_ceph()
  File "/var/lib/juju/agents/unit-microceph-0/charm/./src/charm.py", line 346, in configure_ceph
    raise e
  File "/var/lib/juju/agents/unit-microceph-0/charm/./src/charm.py", line 338, in configure_ceph
    process = subprocess.run(
  File "/usr/lib/python3.10/subprocess.py", line 524, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['ceph', 'config', 'set', 'global', 'mon_allow_pool_size_one', 'true']' returned non-zero exit status 1.
```
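The traceback shows the charm's `configure_ceph` wrapping `subprocess.run` with `check=True`, so any non-zero exit from `ceph config set` (here, because the client cannot reach the monitors) surfaces as a `CalledProcessError`. A minimal sketch of that failure mode, using a stand-in command in place of `ceph` so it does not depend on a Ceph install:

```python
import subprocess

# Stand-in for the charm's failing call:
#   subprocess.run(["ceph", "config", "set", "global",
#                   "mon_allow_pool_size_one", "true"], check=True)
# check=True converts any non-zero exit status into CalledProcessError,
# which is the exception seen in the traceback above.
failing_cmd = ["python3", "-c", "import sys; sys.exit(1)"]
try:
    subprocess.run(failing_cmd, check=True)
except subprocess.CalledProcessError as e:
    print(f"Command {e.cmd} returned non-zero exit status {e.returncode}.")
```

The underlying cause is earlier in the log: the client can neither resolve the `_ceph-mon._tcp` DNS SRV record nor otherwise identify monitors, so `ceph config set` fails before the setting is applied.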

