juju run hangs on k8s workload charm

Bug #1884868 reported by Paul Collins
Affects: Canonical Juju · Status: Expired · Importance: Undecided · Assigned to: Unassigned

Bug Description

I tried to run `juju run --unit mattermost/80 -- state-get` on a k8s model just now, and it hung.

`juju debug-log` reveals the following:

application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.uniter.remotestate got action change: [9] ok=true
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.uniter.operation running operation run action 9
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.machinelock acquire machine lock for uniter (run action 9)
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.machinelock machine lock acquired for uniter (run action 9)
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.uniter.operation preparing operation "run action 9"
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.machinelock machine lock released for uniter (run action 9)
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.uniter.operation lock released
application-mattermost: 2020-06-24 14:38:51 ERROR juju.worker.uniter resolver loop error: preparing operation "run action 9": cannot create runner for action "9": stat /var/lib/juju/agents/unit-mattermost-80/charm: no such file or directory
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.uniter [AGENT-STATUS] failed: resolver loop error
application-mattermost: 2020-06-24 14:38:51 INFO juju.worker.uniter unit "mattermost/80" shutting down: preparing operation "run action 9": cannot create runner for action "9": stat /var/lib/juju/agents/unit-mattermost-80/charm: no such file or directory
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.uniter juju-run listener stopping
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.uniter juju-run listener stopped
application-mattermost: 2020-06-24 14:38:51 INFO juju.worker.caasoperator stopped "mattermost/80", err: preparing operation "run action 9": cannot create runner for action "9": stat /var/lib/juju/agents/unit-mattermost-80/charm: no such file or directory
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.caasoperator "mattermost/80" done: preparing operation "run action 9": cannot create runner for action "9": stat /var/lib/juju/agents/unit-mattermost-80/charm: no such file or directory
application-mattermost: 2020-06-24 14:38:51 ERROR juju.worker.caasoperator exited "mattermost/80": preparing operation "run action 9": cannot create runner for action "9": stat /var/lib/juju/agents/unit-mattermost-80/charm: no such file or directory
application-mattermost: 2020-06-24 14:38:51 INFO juju.worker.caasoperator restarting "mattermost/80" in 3s

I restarted the controller, and the unit dropped into failed:

mattermost/80 active failed 10.1.1.140 8065/TCP

(Note that I also have another bug, LP:1882600, that seems related to a missing ${unit_dir}/charm directory.)

Revision history for this message
John A Meinel (jameinel) wrote : Re: [Bug 1884868] [NEW] juju run hangs on k8s workload charm

'juju run' isn't supported in older k8s deployments, as there is no SSH
daemon running for it to connect to. That said, hanging is a terrible
experience; we should fail fast rather than trying and silently not working.
I think there is work underway to get 'juju run' working on k8s workloads,
but I don't believe it is planned for 2.8.
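Until fail-fast behaviour lands, the hang can be bounded on the client side. This is a hypothetical workaround sketch, not part of Juju itself: it assumes GNU coreutils `timeout` is on PATH, and the 30-second bound is arbitrary.

```shell
# Bound the call so a hung `juju run` fails fast instead of blocking forever.
# Assumptions: GNU coreutils `timeout` is available; 30s is an arbitrary bound.
timeout 30s juju run --unit mattermost/80 -- state-get
status=$?
if [ "$status" -eq 124 ]; then
    # timeout(1) exits with 124 when it kills the command for overrunning
    echo "juju run timed out after 30s" >&2
fi
```

Exit code 124 is how `timeout` distinguishes "command overran and was killed" from the command's own failure codes, so a wrapping script can tell a hang apart from an ordinary error.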


Revision history for this message
Ian Booth (wallyworld) wrote :

juju run is supported since 2.7 :-)

We need to figure out why that /var/lib/juju/agents/unit-mattermost-80/charm directory is missing. It's pretty fundamental, as it's where the charm is unpacked.

Was an upgrade-charm done after the original deployment?

What version of Juju is this?
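One way to check that from the cluster side is to test for the unpacked charm directory inside the operator pod (the path the uniter stats before it can create an action runner). This is a diagnostic sketch; the pod and namespace names in the comment are assumptions and will differ per model.

```shell
# check_charm_dir: report whether an agent dir contains its unpacked charm
# (the path juju stats before creating a runner for an action).
check_charm_dir() {
    if [ -d "$1/charm" ]; then
        echo present
    else
        echo missing
    fi
}

# To run the same test inside the operator pod (names are assumptions):
#   kubectl exec -n mattermost mattermost-operator-0 -- \
#       test -d /var/lib/juju/agents/unit-mattermost-80/charm
check_charm_dir /var/lib/juju/agents/unit-mattermost-80
```

If this prints "missing", the stat error in the log above is expected, and the question becomes what removed or never created the directory (e.g. an upgrade-charm, per the question above).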

Pen Gale (pengale)
Changed in juju:
status: New → Incomplete
Revision history for this message
Launchpad Janitor (janitor) wrote :

[Expired for juju because there has been no activity for 60 days.]

Changed in juju:
status: Incomplete → Expired