I tried to run `juju run --unit mattermost/80 -- state-get` on a k8s model just now, and it hung.
`juju debug-log` reveals the following:
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.uniter.remotestate got action change: [9] ok=true
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.uniter.operation running operation run action 9
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.machinelock acquire machine lock for uniter (run action 9)
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.machinelock machine lock acquired for uniter (run action 9)
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.uniter.operation preparing operation "run action 9"
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.machinelock machine lock released for uniter (run action 9)
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.uniter.operation lock released
application-mattermost: 2020-06-24 14:38:51 ERROR juju.worker.uniter resolver loop error: preparing operation "run action 9": cannot create runner for action "9": stat /var/lib/juju/agents/unit-mattermost-80/charm: no such file or directory
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.uniter [AGENT-STATUS] failed: resolver loop error
application-mattermost: 2020-06-24 14:38:51 INFO juju.worker.uniter unit "mattermost/80" shutting down: preparing operation "run action 9": cannot create runner for action "9": stat /var/lib/juju/agents/unit-mattermost-80/charm: no such file or directory
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.uniter juju-run listener stopping
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.uniter juju-run listener stopped
application-mattermost: 2020-06-24 14:38:51 INFO juju.worker.caasoperator stopped "mattermost/80", err: preparing operation "run action 9": cannot create runner for action "9": stat /var/lib/juju/agents/unit-mattermost-80/charm: no such file or directory
application-mattermost: 2020-06-24 14:38:51 DEBUG juju.worker.caasoperator "mattermost/80" done: preparing operation "run action 9": cannot create runner for action "9": stat /var/lib/juju/agents/unit-mattermost-80/charm: no such file or directory
application-mattermost: 2020-06-24 14:38:51 ERROR juju.worker.caasoperator exited "mattermost/80": preparing operation "run action 9": cannot create runner for action "9": stat /var/lib/juju/agents/unit-mattermost-80/charm: no such file or directory
application-mattermost: 2020-06-24 14:38:51 INFO juju.worker.caasoperator restarting "mattermost/80" in 3s
I restarted the controller, and the unit dropped into failed:
mattermost/80 active failed 10.1.1.140 8065/TCP
(Note that I also have another bug, LP:1882600, that seems related to a missing ${unit_dir}/charm directory.)
'juju run' isn't supported in older k8s deployments, as there is no SSH
daemon running for it to connect to. That said, hanging is a terrible
experience; we should fail fast rather than let it try and silently not
work. I believe there is work underway to get 'juju run' working on k8s
workloads, but I don't think it is planned for 2.8.
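The fail-fast behaviour suggested above amounts to checking the precondition up front and returning an error immediately, instead of retrying (or hanging) when it cannot be met. A minimal sketch in Go, illustrative only and not Juju's actual uniter code; the `failFast` helper and the way the check is surfaced are assumptions:

```go
package main

import (
	"fmt"
	"os"
)

// failFast is an illustrative guard (not Juju's real implementation):
// verify that the unit's charm directory exists before attempting to
// run an action, and report the problem immediately rather than
// retrying indefinitely.
func failFast(charmDir string) error {
	if _, err := os.Stat(charmDir); err != nil {
		return fmt.Errorf("cannot run action: %w", err)
	}
	return nil
}

func main() {
	// A missing directory is reported right away instead of hanging.
	if err := failFast("/var/lib/juju/agents/unit-mattermost-80/charm"); err != nil {
		fmt.Println("error:", err)
	}
}
```

With a check like this at the front of the resolver loop, the `stat ... no such file or directory` condition seen in the log would surface to the caller immediately rather than after a restart cycle.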
On Tue, Jun 23, 2020 at 9:55 PM Paul Collins <email address hidden> wrote: