While testing openstack-charms-next, the ceph-mon charm could not communicate with Juju while executing a ceph-mon action. This is the first time we have seen this.
2020-10-13 14:48:06 DEBUG juju.worker.dependency engine.go:598 "hook-retry-strategy" manifold worker stopped: restart immediately
2020-10-13 14:48:06 DEBUG juju.worker.uniter runlistener.go:129 juju-run listener stopping
2020-10-13 14:48:06 DEBUG juju.worker.uniter runlistener.go:148 juju-run listener stopped
2020-10-13 14:48:07 DEBUG juju.worker.dependency engine.go:564 "hook-retry-strategy" manifold worker started at 2020-10-13 14:48:07.465930028 +0000 UTC
2020-10-13 14:48:09 DEBUG juju.worker.proxyupdater proxyupdater.go:168 applying in-process juju proxy settings proxy.Settings{Http:"http://squid.internal:3128/", Https:"http://squid.internal:3128/", Ftp:"", NoProxy:"10.0.0.0/8,10.246.64.200,10.246.64.201,10.246.64.202,127.0.0.1,172.16.0.0/12,192.168.0.0/16,localhost", AutoNoProxy:""}
2020-10-13 14:48:11 DEBUG juju.worker.uniter.operation executor.go:130 executing operation "run action 48"
2020-10-13 14:48:11 DEBUG juju.worker.uniter agent.go:20 [AGENT-STATUS] executing: running action juju-run
2020-10-13 14:48:12 DEBUG juju.machinelock machinelock.go:186 machine lock released for uniter (run action 48)
2020-10-13 14:48:12 DEBUG juju.worker.uniter.operation executor.go:113 lock released
2020-10-13 14:48:12 ERROR juju.worker.uniter agent.go:31 resolver loop error: executing operation "run action 48": read tcp 10.246.64.202:36182->10.246.64.200:37017: read: connection reset by peer
2020-10-13 14:48:12 DEBUG juju.worker.uniter agent.go:20 [AGENT-STATUS] failed: resolver loop error
2020-10-13 14:48:12 DEBUG juju.worker.dependency engine.go:598 "migration-minion" manifold worker stopped: txn watcher sync error
2020-10-13 14:48:12 ERROR juju.worker.dependency engine.go:671 "migration-minion" manifold worker returned unexpected error: txn watcher sync error
2020-10-13 14:48:12 DEBUG juju.worker.dependency engine.go:673 stack trace:
txn watcher sync error
/workspace/_build/src/github.com/juju/juju/rpc/client.go:178:
/workspace/_build/src/github.com/juju/juju/api/apiclient.go:1200:
2020-10-13 14:48:12 DEBUG juju.worker.dependency engine.go:598 "log-sender" manifold worker stopped: cannot send log message: write tcp 10.246.65.65:36142->10.246.64.202:17070: write: connection reset by peer
2020-10-13 14:48:12 ERROR juju.worker.dependency engine.go:671 "log-sender" manifold worker returned unexpected error: cannot send log message: write tcp 10.246.65.65:36142->10.246.64.202:17070: write: connection reset by peer
2020-10-13 14:48:12 DEBUG juju.worker.dependency engine.go:673 stack trace:
write tcp 10.246.65.65:36142->10.246.64.202:17070: write: connection reset by peer
/workspace/_build/src/github.com/juju/juju/api/client.go:683:
/workspace/_build/src/github.com/juju/juju/api/logsender/logsender.go:75: cannot send log message
/workspace/_build/src/github.com/juju/juju/worker/logsender/worker.go:69:
2020-10-13 14:48:12 DEBUG juju.worker.leadership tracker.go:227 ceph-mon/0 waiting for ceph-mon leadership release gave err: error blocking on leadership release: lease manager stopped
2020-10-13 14:48:12 DEBUG juju.worker.dependency engine.go:598 "leadership-tracker" manifold worker stopped: error while ceph-mon/0 waiting for ceph-mon leadership release: error blocking on leadership release: lease manager stopped
2020-10-13 14:48:12 ERROR juju.worker.dependency engine.go:671 "leadership-tracker" manifold worker returned unexpected error: error while ceph-mon/0 waiting for ceph-mon leadership release: error blocking on leadership release: lease manager stopped
2020-10-13 14:48:12 DEBUG juju.worker.dependency engine.go:673 stack trace:
lease manager stopped
/workspace/_build/src/github.com/juju/juju/api/leadership/client.go:61: error blocking on leadership release
/workspace/_build/src/github.com/juju/juju/worker/leadership/tracker.go:140: error while ceph-mon/0 waiting for ceph-mon leadership release
2020-10-13 14:48:12 DEBUG juju.worker.dependency engine.go:598 "hook-retry-strategy" manifold worker stopped: txn watcher sync error
2020-10-13 14:48:12 ERROR juju.worker.dependency engine.go:671 "hook-retry-strategy" manifold worker returned unexpected error: txn watcher sync error
2020-10-13 14:48:12 DEBUG juju.worker.dependency engine.go:673 stack trace:
txn watcher sync error
/workspace/_build/src/github.com/juju/juju/rpc/client.go:178:
/workspace/_build/src/github.com/juju/juju/api/apiclient.go:1200:
2020-10-13 14:48:12 DEBUG juju.worker.dependency engine.go:598 "meter-status" manifold worker stopped: txn watcher sync error
2020-10-13 14:48:12 ERROR juju.worker.dependency engine.go:671 "meter-status" manifold worker returned unexpected error: txn watcher sync error
2020-10-13 14:48:12 DEBUG juju.worker.dependency engine.go:673 stack trace:
txn watcher sync error
/workspace/_build/src/github.com/juju/juju/rpc/client.go:178:
/workspace/_build/src/github.com/juju/juju/api/apiclient.go:1200:
2020-10-13 14:48:12 DEBUG juju.worker.dependency engine.go:598 "api-address-updater" manifold worker stopped: txn watcher sync error
2020-10-13 14:48:12 ERROR juju.worker.dependency engine.go:671 "api-address-updater" manifold worker returned unexpected error: txn watcher sync error
2020-10-13 14:48:12 DEBUG juju.worker.dependency engine.go:673 stack trace:
txn watcher sync error
/workspace/_build/src/github.com/juju/juju/rpc/client.go:178:
/workspace/_build/src/github.com/juju/juju/api/apiclient.go:1200:
2020-10-13 14:48:12 DEBUG juju.worker.dependency engine.go:598 "upgrader" manifold worker stopped: txn watcher sync error
2020-10-13 14:48:12 ERROR juju.worker.dependency engine.go:671 "upgrader" manifold worker returned unexpected error: txn watcher sync error
Logs and crashdumps at https://oil-jenkins.canonical.com/artifacts/f8e0ec98-ccf6-4ecc-afb4-86e7ca491528/index.html
This looks like a transient networking issue: the unit agent got "connection reset by peer" talking to the controller (10.246.64.200:37017, the MongoDB port, and 10.246.64.202:17070, the API port), which then cascaded into "txn watcher sync error" failures across the dependency engine's manifold workers. Juju should probably be handling it better, though.
Work here involves digging through the logs to figure out what happened, and deciding whether a retry or better error messaging would be appropriate.