Unable to juju upgrade-series from xenial to bionic without manual intervention
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
Canonical Livepatch Charm | Won't Fix | Medium | Zachary Zehring |
Bug Description
juju upgrade-series from xenial to bionic fails.
Deploy a xenial instance with the charm:
juju deploy --series=xenial cs:ubuntu
juju deploy --series=xenial cs:canonical-livepatch
juju add-relation ubuntu canonical-livepatch
Upgrade the series:
juju upgrade-series 0 prepare bionic
juju ssh 0
do-release-upgrade
After the post-upgrade reboot, run the complete step to trigger the completion juju hooks:
juju upgrade-series 0 complete
This hangs while running hooks, and juju status shows start hook errors:
canonical-
juju logs show:
2020-06-15 15:42:41 ERROR juju.worker.
2020-06-15 15:42:52 ERROR juju.worker.
2020-06-15 15:43:13 ERROR juju.worker.
2020-06-15 15:43:55 ERROR juju.worker.
The workaround seems to be to remove and recreate the charm's venv:
moon127@hobbes:~$ juju run --application canonical-livepatch 'rm $JUJU_CHARM_
moon127@hobbes:~$ juju run --application canonical-livepatch 'rm -rf $JUJU_CHARM_
moon127@hobbes:~$ juju run --application canonical-livepatch 'hooks/
Reading package lists...
Building dependency tree...
Reading state information...
build-essential is already the newest version (12.4ubuntu1).
python3-setuptools is already the newest version (39.0.1-2).
python3-yaml is already the newest version (3.12-1build2).
python3-wheel is already the newest version (0.30.0-0.2).
python3-dev is already the newest version (3.6.7-1~18.04).
python3-pip is already the newest version (9.0.1-
0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
Reading package lists...
Building dependency tree...
Reading state information...
0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
Reading package lists...
Building dependency tree...
Reading state information...
virtualenv is already the newest version (15.1.0+ds-1.1).
0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
Already using interpreter /usr/bin/python3
Using base prefix '/usr'
New python executable in /var/lib/
Also creating executable in /var/lib/
Please make sure you remove any previous custom paths from your /root/.
Installing setuptools, pkg_resources, pip, wheel...done.
Collecting pip
Installing collected packages: pip
Found existing installation: pip 9.0.1
Uninstalling pip-9.0.1:
Successfully uninstalled pip-9.0.1
Successfully installed pip-18.1
Looking in links: wheelhouse
Collecting setuptools
Collecting setuptools-scm
Installing collected packages: setuptools, setuptools-scm
Found existing installation: setuptools 39.0.1
Uninstalling setuptools-39.0.1:
Successfully uninstalled setuptools-39.0.1
Successfully installed setuptools-41.6.0 setuptools-
Looking in links: wheelhouse
Processing ./wheelhouse/
Processing ./wheelhouse/
Processing ./wheelhouse/
Processing ./wheelhouse/
Processing ./wheelhouse/
Processing ./wheelhouse/
Processing ./wheelhouse/
Processing ./wheelhouse/
Processing ./wheelhouse/
Processing ./wheelhouse/
Processing ./wheelhouse/
Processing ./wheelhouse/
Processing ./wheelhouse/
Processing ./wheelhouse/
Processing ./wheelhouse/
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Collecting MarkupSafe>=0.23 (from Jinja2==2.11.1)
Collecting six (from charmhelpers=
Collecting netaddr (from charmhelpers=
Building wheels for collected packages: PyYAML, pyaml, Jinja2, MarkupSafe, setuptools-scm, Tempita, wheel, charmhelpers, netaddr, charms.reactive, setuptools, pip
Running setup.py bdist_wheel for PyYAML: started
Running setup.py bdist_wheel for PyYAML: finished with status 'done'
Stored in directory: /root/.
Running setup.py bdist_wheel for pyaml: started
Running setup.py bdist_wheel for pyaml: finished with status 'done'
Stored in directory: /root/.
Running setup.py bdist_wheel for Jinja2: started
Running setup.py bdist_wheel for Jinja2: finished with status 'done'
Stored in directory: /root/.
Running setup.py bdist_wheel for MarkupSafe: started
Running setup.py bdist_wheel for MarkupSafe: finished with status 'done'
Stored in directory: /root/.
Running setup.py bdist_wheel for setuptools-scm: started
Running setup.py bdist_wheel for setuptools-scm: finished with status 'done'
Stored in directory: /root/.
Running setup.py bdist_wheel for Tempita: started
Running setup.py bdist_wheel for Tempita: finished with status 'done'
Stored in directory: /root/.
Running setup.py bdist_wheel for wheel: started
Running setup.py bdist_wheel for wheel: finished with status 'done'
Stored in directory: /root/.
Running setup.py bdist_wheel for charmhelpers: started
Running setup.py bdist_wheel for charmhelpers: finished with status 'done'
Stored in directory: /root/.
Running setup.py bdist_wheel for netaddr: started
Running setup.py bdist_wheel for netaddr: finished with status 'done'
Stored in directory: /root/.
Running setup.py bdist_wheel for charms.reactive: started
Running setup.py bdist_wheel for charms.reactive: finished with status 'done'
Stored in directory: /root/.
Running setup.py bdist_wheel for setuptools: started
Running setup.py bdist_wheel for setuptools: finished with status 'done'
Stored in directory: /root/.
Running setup.py bdist_wheel for pip: started
Running setup.py bdist_wheel for pip: finished with status 'done'
Stored in directory: /root/.
Successfully built PyYAML pyaml Jinja2 MarkupSafe setuptools-scm Tempita wheel charmhelpers netaddr charms.reactive setuptools pip
Installing collected packages: PyYAML, pyaml, MarkupSafe, Jinja2, setuptools-scm, Tempita, wheel, six, netaddr, charmhelpers, charms.reactive, setuptools, pip
Successfully installed Jinja2-2.11.1 MarkupSafe-1.1.1 PyYAML-5.2 Tempita-0.5.2 charmhelpers-
lxc
lxc
All snaps up to date.
Changed in charm-canonical-livepatch:
status: New → In Progress
assignee: nobody → Zachary Zehring (zzehring)
When hitting this issue, was the machine an LXD container? I was able to reproduce this issue when deploying to LXD, but not when deploying to a VM.
As an aside, I also saw both the ubuntu and canonical-livepatch units hang on the start hook.