microk8s crashes with "FAIL: Service snap.microk8s.daemon-apiserver is not running"
Hello,
I have installed a 3-node MicroK8s cluster. Everything works great for a couple of days, but then the apiserver fails for no evident reason.
Below is the `microk8s inspect` output; the generated tarball inspection-report-20200925_103006.tar.gz is attached.
```
Inspecting Certificates
Inspecting services
Service snap.microk8s.daemon-cluster-agent is running
Service snap.microk8s.daemon-containerd is running
FAIL: Service snap.microk8s.daemon-apiserver is not running
For more details look at: sudo journalctl -u snap.microk8s.daemon-apiserver
Service snap.microk8s.daemon-apiserver-kicker is running
Service snap.microk8s.daemon-control-plane-kicker is running
Service snap.microk8s.daemon-proxy is running
Service snap.microk8s.daemon-kubelet is running
Service snap.microk8s.daemon-scheduler is running
Service snap.microk8s.daemon-controller-manager is running
Copy service arguments to the final report tarball
Inspecting AppArmor configuration
Gathering system information
Copy processes list to the final report tarball
Copy snap list to the final report tarball
Copy VM name (or none) to the final report tarball
Copy disk usage information to the final report tarball
Copy memory usage information to the final report tarball
Copy server uptime to the final report tarball
Copy current linux distribution to the final report tarball
Copy openSSL information to the final report tarball
Copy network configuration to the final report tarball
Inspecting kubernetes cluster
Inspect kubernetes cluster
Building the report tarball
Report tarball is at /var/snap/microk8s/1719/inspection-report-20200925_103006.tar.gz
```
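When this happens across several nodes, it can help to scan the saved `microk8s inspect` output for the failing services rather than eyeballing it. A minimal sketch (not part of MicroK8s; the `failing_services` helper is hypothetical, and the sample text is an excerpt of the report above):

```python
# Extract service names from FAIL lines in `microk8s inspect` output.
# Excerpt of the inspect report shown above:
sample = """\
Service snap.microk8s.daemon-cluster-agent is running
Service snap.microk8s.daemon-containerd is running
FAIL: Service snap.microk8s.daemon-apiserver is not running
Service snap.microk8s.daemon-kubelet is running
"""

def failing_services(inspect_output: str) -> list[str]:
    """Return names of services flagged with a 'FAIL:' line."""
    failed = []
    for line in inspect_output.splitlines():
        line = line.strip()
        # Line shape: "FAIL: Service <name> is not running"
        if line.startswith("FAIL:"):
            failed.append(line.split()[2])
    return failed

print(failing_services(sample))  # ['snap.microk8s.daemon-apiserver']
```

For the flagged service, `sudo journalctl -u snap.microk8s.daemon-apiserver` (as the report itself suggests) is the next place to look.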
This is not the first time it has happened. My attempt to deploy a small production cluster based on MicroK8s is hindered by this problem in the test environment.
Issue Analytics
- State:
- Created: 3 years ago
- Reactions: 14
- Comments: 151 (25 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Hi all,
We will be doing a release to the stable channels (latest/stable and 1.19/stable) this week.
This release will be addressing the segmentation faults and data corruption issues.
We are still investigating the memory leak.
Thank you for your patience.
The PRs with the fixes at the dqlite layer are in [1, 2]. Also, in the latest/edge, latest/beta and latest/candidate channels you will find a setup where Kubernetes services start as goroutines instead of systemd processes, resulting in further memory-footprint gains.
The dqlite fixes are being backported to 1.19 and 1.20 and are already on the candidate channels. In the next couple of days these fixes will make it to the stable channels.
[1] https://github.com/canonical/raft/pull/168 [2] https://github.com/canonical/raft/pull/167
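Since the comment above says the dqlite fixes reached the candidate channels ahead of stable, one way to pick them up early is to refresh a node onto candidate. A hedged sketch of the standard snap commands (run per node; on a real cluster, drain workloads first):

```shell
# Show which revisions each MicroK8s channel currently carries.
snap info microk8s

# Move this node to the 1.19 candidate channel to pick up the dqlite fixes.
sudo snap refresh microk8s --channel=1.19/candidate

# Wait until the local node reports ready again.
microk8s status --wait-ready
```

Once the fixes land in 1.19/stable, `sudo snap refresh microk8s --channel=1.19/stable` moves the node back to the stable track.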